Straight talking 62
by Tim Anderson
Tim Anderson investigates the Internet of Things; and why Microsoft should share, share, share.
HardCopy Issue: 62 | Published: February 25, 2014
Among the buzzwords (or buzzphrases) of the moment is the Internet of Things (IoT). Like many buzzwords, IoT is hard to define and has different shades of meaning depending on who you talk to. I was at CES in Las Vegas, source of all that is new, cool and excessive in consumer technology, and connected ‘things’ ranged from Parrot’s MiniDrone to Kolibree’s Connected Toothbrush to the inevitable Internet fridge from Samsung. Such was the excitement around wearable technology that Sony, at its press conference, held up a small piece of white plastic, told us that it was the “core gadget” behind Sony SmartWear wearable entertainment – and nothing more – and considered it news. A section of the exhibition called iHealth featured gadgets that monitor your heart, temperature, blood pressure, exercise regime and more, feeding possibly too much information back to cloud-based services.
Is Microsoft learning to communicate again?
Remember the controversy surrounding the user interface changes in Visual Studio 2012? When the preview appeared, featuring near-monochrome icons that were hard to distinguish one from another, there was an outcry. Microsoft rushed to fix it before the release, making considerable improvements, and further tweaked it with the Blue theme that came out with Update 2 in May 2013.
In December 2013, Microsoft Project Manager Brian Harry made some frank remarks about the incident in the comments on an MSDN blog:
“The implementation of the new UI in 2012 was a mess. When I first saw it … I didn’t like it, and the internal feedback was consistent … there was a bit of a ‘cone of secrecy’ around the new UI because we didn’t want it ‘leaking’. Even I didn’t get to see it until months into it. … I think the biggest learning was – Don’t kid yourself into thinking you can do a ripple effect feature like that ‘on the cheap’. Another learning … is secrecy is bad – it lets problems fester until they become crises. Share, share, share. The feedback is critical to course correction.”
Now put this together with comments from former Microsoft employee Hal Berenson, writing on the question of whether Windows 8 is the new Vista. He identifies the key problem in the development process:
“The secrecy. The unwillingness to bounce things off customers early enough to make changes. A worship of schedule and process above wisdom and expertise, even if the result is the wrong thing shipped on the promised timeline.”
Adding in the comments:
“I hear incredible stories of the Windows team leadership just refusing to pay any attention to input from other parts of the company.”
If Windows 8 had won immediate market acceptance and made significant inroads into the tablet market dominated by iOS and Android, all would be forgiven. The fact is, Windows 8 has proved a hard sell, despite solid improvements in performance and a decent tablet UI. Windows 8.1 softened the changes somewhat, with boot to desktop and the reappearance of the Start button. That suggests a gentler initial approach, with more concession to what had gone before – perhaps the ability to use Silverlight in the tablet personality, for example – would have helped adoption.
Harry’s remarks are encouraging though, because he felt able to share them, and because they may show a change in mood as Microsoft figures out what comes next for the Windows client. Microsoft has been burned before by sharing too much too early (three pillars of Longhorn anyone?), but for the last few years it has gone too far in the opposite direction. It may work for Apple, but for Microsoft with its partner-based ecosystem a little more sharing would be a good thing.
Samsung’s Smart Home concept promises to connect all your home appliances, not only the refrigerator but also lighting, air conditioning, washing machine, TV, cameras, and of course the robot vacuum cleaner, using its Smart Home Protocol, managed by a cloud-based server, and controlled using a smartphone or tablet.
IoT is not just about the kind of gadgets on show at CES. Consultants RedMonk put on an IoT conference in London a month before CES, called ThingMonk, and I noted down this comment from Morten Bagai, Director of Business Development at Heroku:
“We get to talk to the largest companies in the world, the CEOs that are viewing the Internet of Things as a massively transformational force in terms of how they run their business and how they connect to their customers. We are on the precipice of a major new wave in application development that is going to change what we understand when we talk about an application. Within the next two to three years, when we talk about an app of any reasonable complexity it’s going to have a device perspective.”
Hyperbole? Probably, especially as Bagai also admitted that “We’re not seeing a lot of demand yet,” attributing that fact to IoT being “an emergent area.”
Still, the fact that something is hyped excessively does not make it insignificant. The IoT buzzphrase embraces a number of areas:
- Embedded computing. In one sense there is nothing new in that. Back in 1991, Java was a project called Oak and intended to run on small consumer devices and smart remote controls. On the other hand, lower cost combined with ubiquitous connectivity means that the importance of embedded computing continues to grow.
- Connected devices. It is easier than ever to make a device connected. Bluetooth Low Energy, for example, means a device can talk to a network with little power and at little expense.
- Sensors. Many IoT examples boil down to a sensor or sensors feeding data back to a service. The value of that data is immense in areas like personal health, weather, pollution monitoring, temperature control, and many more.
- Tagging. IoT is not just about devices, but also ideas like RFID tags, which make objects machine-readable without the manual steps it takes to read a barcode.
- Machine to Machine (M2M). Smart communication between devices, or between devices and services, that enables automation, data collection and other applications.
- Wearable computing. This may be just a special case of connected devices and sensors, but it also refers to efforts to get many of the features of a smartphone, for example, into a form that is more convenient for the user. Google Glass is an obvious example: a computer embedded into a spectacle-like frame. Smart watches are another possible approach.
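The sensors and M2M cases above usually come down to the same pattern: a device takes a reading and sends it to a service as a small structured message. A minimal sketch in Node.js might look like this – the field names, device ID and simulated reading are my own illustrative assumptions, not any vendor’s protocol:

```javascript
// Hypothetical sketch of a sensor-to-service message. On a real device,
// readTemperature() would talk to hardware; here it returns a fixed value.
function readTemperature() {
  return 21.5; // degrees Celsius (simulated)
}

// Build the JSON payload a device might POST to a collection service.
// The schema (deviceId, metric, value, timestamp) is an assumption.
function buildReading(deviceId) {
  return JSON.stringify({
    deviceId: deviceId,
    metric: 'temperature',
    value: readTemperature(),
    timestamp: new Date().toISOString()
  });
}

console.log(buildReading('sensor-01'));
```

The point is how little there is to it: the value of IoT lies less in any one message than in the service that aggregates thousands of them.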
So what does IoT mean for developers? The answer is partly about device-awareness: “In order to have a complete developer experience, we have to have a perspective that includes devices, how we receive their data, how we control them, how we deploy software to them,” said Bagai, talking about Heroku’s perspective. Heroku is a cloud application platform which supports a number of languages including Ruby, Node.js, Python and Java.
Bagai’s remarks apply more generally too. It pays for developers to have a device perspective on any application they build, even if only for future use. That might just mean support for smartphones and tablets, but it might also mean thinking outside the box. What sort of data might make this application smarter and more useful? An example is the addition of real-time traffic data to mapping and navigation apps, which greatly improves their usefulness.
Unfortunately there is a substantial roadblock to IoT development, and that is standards – their lack, or their multiplicity. Like Samsung with its Smart Home Protocol (SHP), vendors like to do their own thing. At least Samsung is talking about opening up SHP to third parties, which is not always the case.
Want to investigate IoT? While there may be a shortage of settled standards, there is a lot of fun to be had trying out development on commodity-priced boards like the Raspberry Pi, the Arduino, or Intel’s Arduino-compatible Galileo. On the software side there is Node-RED, which integrates with both Pi and Arduino, and is built on Node.js to create “a visual tool for wiring the Internet of Things.” There is no better way to get a device perspective than to start prototyping with some actual hardware.