News: Microsoft to push cloud to edge

In recent years, there has been a strong push to move everything to a centralized cloud, enabled by virtualization and driven by the need to cut costs, reduce the time to market for new services, and increase flexibility. At the same time, the fog computing community has discussed how location impacts performance, efficient use of network resources and subscriber experience: physical distance inevitably increases latency. The OpenFog Consortium was therefore organized to develop a cross-industry approach to enabling end-to-end IoT deployments by creating a reference architecture that drives interoperability in connecting the edge and the cloud. The group has identified numerous IoT use cases that require edge computing, including smart buildings, drone-based delivery services, real-time subsurface imaging, traffic congestion management and video surveillance, and released a fog computing reference architecture in February 2017.

Now it seems that Microsoft, too, is pushing the cloud to the edge.

Microsoft has recently introduced IoT Edge and Azure Sphere, its first reference design aiming to combine the power of a new class of microcontroller (MCU) with built-in security, a purpose-built OS optimized for security and agility, and a cloud security service that guards the device. Azure Sphere is targeted at the outer regions of the intelligent edge and is Microsoft’s push to serve the billions of MCU-powered devices that are built and deployed each year.

Azure Sphere comprises three components: a secured microcontroller unit (MCU) that by design and production has a built-in security layer; an OS running an embedded Linux kernel; and an Azure-provided service brokering trust for device-to-device and device-to-cloud communication.

Today, Azure Sphere is in private preview, and according to Microsoft the first devices equipped with the secured MCU and OS are expected to ship by the end of 2018.

More information:

Partner, Fredrik Svensson, Fredrik.svensson@glaze.se, +46 705 08 70 70

Positioning technologies currently applied across industries:

Global Navigation Satellite System (GNSS): Outdoor positioning requires line-of-sight to satellites, e.g. GPS: the tracking device calculates its position from the timing signals of at least 4 satellites and then transmits it to the receiving network
–    via a local data network, e.g. WiFi or a proprietary Wide Area Network
–    via a public/global data network, e.g. 3G/4G
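To illustrate the calculation, the sketch below solves for a receiver position and clock bias from satellite timing signals already converted to pseudoranges in metres. The satellite coordinates, the Gauss-Newton routine and all variable names are illustrative assumptions, not any particular receiver's implementation.

```python
import numpy as np

def solve_position(sat_positions, pseudoranges, iterations=10):
    """Illustrative Gauss-Newton solve for receiver position (x, y, z) and
    clock bias (in metres) from >= 4 satellite pseudoranges.
    sat_positions: (N, 3) array of satellite coordinates in metres."""
    x = np.zeros(4)  # initial guess: Earth's centre, zero clock bias
    for _ in range(iterations):
        diffs = sat_positions - x[:3]            # vectors receiver -> satellites
        ranges = np.linalg.norm(diffs, axis=1)   # geometric distances
        predicted = ranges + x[3]                # add the clock-bias term
        residuals = pseudoranges - predicted
        # Jacobian: negated unit line-of-sight vectors, plus d/d(bias) = 1
        J = np.hstack([-diffs / ranges[:, None], np.ones((len(ranges), 1))])
        x += np.linalg.lstsq(J, residuals, rcond=None)[0]
    return x[:3], x[3]
```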

Active RFID: A local wireless positioning infrastructure built on premises, indoors or outdoors, calculates the position based on the Time of Flight of the signal and ID emitted by the tracking device to at least 3 receivers, or when the device passes through a portal. The network operates in frequency bands such as 2.4 GHz WiFi, 868 MHz and 3.7 GHz (UWB – Ultra Wide Band); the former integrates with an existing data network, the latter promises an impressive 0.3 m accuracy. Tracking devices are battery powered.
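As a rough sketch of the Time of Flight calculation, the snippet below estimates a 2D position from flight times to three or more receivers at known coordinates. The linearised least-squares approach and the names are illustrative assumptions, not a specific vendor's algorithm.

```python
import numpy as np

C = 299_792_458.0  # radio propagation speed, m/s

def position_from_tof(receivers, tofs):
    """Least-squares 2D position from one-way time-of-flight to >= 3 receivers.
    receivers: list of (x, y) coordinates in metres; tofs: times in seconds."""
    distances = C * np.asarray(tofs)
    x0, y0 = receivers[0]
    d0 = distances[0]
    A, b = [], []
    # Subtract the first receiver's range equation to make the system linear
    for (xi, yi), di in zip(receivers[1:], distances[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    pos, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return pos  # (x, y) estimate in metres
```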

Passive RFID: Proximity tracking devices are passive tags detected and identified by a reader within close range. Example: price tags with built-in RFID set off an alarm if they leave the store. Numerous proprietary systems are on the market. NFC (Near Field Communication) signifies a system where the reader performs the identification by almost touching the tag.

Beacons: Bluetooth Low Energy (BLE) signals are sent from a fixed position to a mobile device, which then roughly estimates its proximity from the fading of the signal strength. For robotic vacuum cleaners, an infrared light beacon can be used to guide the vehicle towards the charging station.
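The proximity estimate typically comes from a log-distance path-loss model. A minimal sketch follows; the calibrated 1-metre power of -59 dBm and the path-loss exponent of 2 are placeholder assumptions that would be tuned per beacon and environment.

```python
def distance_from_rssi(rssi_dbm, tx_power_dbm=-59.0, path_loss_exponent=2.0):
    """Rough distance estimate in metres from BLE signal strength using the
    log-distance path-loss model. tx_power_dbm is the calibrated RSSI at 1 m;
    the exponent is ~2 in free space and higher indoors."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

# Example: a measured -75 dBm against a -59 dBm 1-metre reference
# suggests the phone is roughly 6 metres from the beacon.
print(round(distance_from_rssi(-75.0), 1))
```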

Dead Reckoning: Movement is measured by incrementally counting the driving wheels’ rotations and the steering wheel’s angle. Small variations in wheel size or slip on the surface may introduce an accumulated error, hence this method is often combined with other systems to obtain an exact re-positioning reset.
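One incremental update of such an estimate could look like the sketch below, which uses a simple bicycle model; the tick count, wheel circumference and wheelbase values are illustrative placeholders, not measurements from a real vehicle.

```python
import math

def dead_reckon_step(x, y, heading, wheel_ticks, steering_angle,
                     ticks_per_rev=360, wheel_circumference=0.47, wheelbase=1.2):
    """One dead-reckoning update from incremental wheel encoder ticks and the
    current steering angle (radians), using a bicycle model."""
    distance = wheel_ticks / ticks_per_rev * wheel_circumference
    heading += distance / wheelbase * math.tan(steering_angle)  # turn rate
    x += distance * math.cos(heading)
    y += distance * math.sin(heading)
    return x, y, heading
```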

Scan and draw map: Laser beam reflections are measured and used to calculate the perimeter of a room and the objects in it. Used, for instance, when positioning forklifts in storage facilities.
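A first step in such a system is turning the per-beam range readings into points in the facility frame, which can then be matched against a stored outline of the room. The conversion below is a minimal sketch; the pose and angle parameters are assumed inputs, and the map-matching step itself is omitted.

```python
import math

def scan_to_points(ranges, angle_min, angle_increment, pose=(0.0, 0.0, 0.0)):
    """Convert laser reflections (one range per beam) to 2D points in the
    facility frame, given the vehicle pose (x, y, heading in radians)."""
    x0, y0, theta = pose
    points = []
    for i, r in enumerate(ranges):
        angle = theta + angle_min + i * angle_increment
        points.append((x0 + r * math.cos(angle), y0 + r * math.sin(angle)))
    return points
```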

Visual recognition: The most advanced degree of vision is required in fully autonomous vehicles, which use laser ranging (Lidar) and radar to recognise all kinds of objects and obstructions. A much simpler method can be used for calculating an indoor position: tracking printed 2D barcodes placed at regular intervals in a matrix across the ceiling. An upward-facing camera identifies each pattern and the skewed projection of the viewing angle.
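A simplified sketch of the ceiling-barcode idea follows. It assumes each barcode encodes its cell in the grid and ignores the skew of the projection, using only the marker's pixel offset from the image centre with a pinhole-camera relation; the grid spacing, ceiling height and focal length are illustrative assumptions.

```python
def position_from_ceiling_marker(marker_col, marker_row, pixel_offset,
                                 grid_spacing=1.5, ceiling_height=4.0,
                                 focal_length_px=800.0):
    """Indoor fix from a decoded ceiling barcode: the marker's grid cell gives
    a coarse position, and its offset from the image centre (in pixels) gives
    the offset within the cell via metres = pixels * height / focal_length."""
    marker_x = marker_col * grid_spacing
    marker_y = marker_row * grid_spacing
    dx_px, dy_px = pixel_offset
    offset_x = dx_px * ceiling_height / focal_length_px
    offset_y = dy_px * ceiling_height / focal_length_px
    # The camera sits directly below the point seen at the image centre,
    # so subtract the marker's offset from that point.
    return marker_x - offset_x, marker_y - offset_y
```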

Inertia: Relative movement detection, like the classical gyroscopes in aircraft, now miniaturised to fit on a chip. From a known starting position and velocity, this method measures acceleration as well as rotation in all 3 dimensions, which describes any change in movement.
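The integration from a known start could, in a planar simplification, look like the sketch below; gravity compensation, sensor bias and the third dimension are deliberately ignored, and the state layout is an assumption for illustration only.

```python
import math

def imu_step(state, accel_body, gyro_z, dt):
    """Planar strap-down integration: accumulate rotation rate and body-frame
    acceleration to track changes in movement.
    state = (x, y, vx, vy, heading); accel_body = (forward, left) in m/s^2."""
    x, y, vx, vy, heading = state
    heading += gyro_z * dt                      # integrate rotation rate
    ax = accel_body[0] * math.cos(heading) - accel_body[1] * math.sin(heading)
    ay = accel_body[0] * math.sin(heading) + accel_body[1] * math.cos(heading)
    vx += ax * dt                               # integrate acceleration
    vy += ay * dt
    x += vx * dt                                # integrate velocity
    y += vy * dt
    return (x, y, vx, vy, heading)
```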

Magnetic field: A digital compass (on a chip) can identify the orientation, provided no other magnetic signals are causing distortion.
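The orientation calculation itself is a single arctangent over the horizontal field components, assuming the sensor lies flat; the exact sign depends on the chip's axis convention, so this is a sketch rather than a universal formula.

```python
import math

def compass_heading(mag_x, mag_y):
    """Heading in degrees (0-360) of the horizontal magnetic field in the
    sensor frame, assuming a level sensor and no nearby iron distorting it."""
    return math.degrees(math.atan2(mag_y, mag_x)) % 360.0
```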

Mix and Improve: Several of the listed technologies, well-proven or novel, supplement each other, each contributing to the precision and robustness of the system. Set a fixpoint via portals or a visual reference to reset dead reckoning and relative movement; supplement the satellite signal with a known fixpoint: “real-time kinematics” refines GPS accuracy to mere centimetres; or combine dead reckoning with visual recognition of 2D barcodes in the ceiling.
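A minimal sketch of such a combination is a drifting dead-reckoned estimate that is pulled towards an absolute fixpoint whenever one is observed; the blending weight below is a crude stand-in for a proper Kalman-filter update, and heading wrap-around is ignored for brevity.

```python
def fuse_fixpoint(estimate, fixpoint, weight=1.0):
    """Blend a drifting dead-reckoned estimate (x, y, heading) with an absolute
    fixpoint such as a portal passage or a ceiling barcode. weight=1.0 is a
    hard reset; smaller values trust the fixpoint only partially."""
    # Note: heading wrap-around at +/-180 degrees is not handled here.
    return tuple(e + weight * (f - e) for e, f in zip(estimate, fixpoint))
```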