Microsoft knows very well that in most cases, people want artificial intelligence to work with information as it happens. Virtual assistants need to respond within a few seconds, and smart security cameras need to send alerts while intruders are still in sight. To that end, the company recently unveiled its own hardware acceleration platform, Project Brainwave, which promises fast, real-time AI in the cloud. Thanks to Intel's new Stratix 10 field-programmable gate array (FPGA) chip, it can crunch through 39.5 teraflops of machine learning work at less than a millisecond of latency, without having to batch requests together. In other words, it can handle complex AI tasks as soon as they arrive.
Unlike many of its hard-coded rivals, it is notably flexible. Where competitors typically lock their approach in from the outset, it relies on a 'soft' dynamic neural network processing engine loaded onto off-the-shelf FPGA chips. It works with other systems such as Google's TensorFlow, and it can handle Microsoft's own AI framework, the Cognitive Toolkit. You can build a machine learning system the way you like and expect it to run in real time, rather than letting the hardware dictate your methods.
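To make the "build it your way, run it in real time" idea concrete, here is a minimal sketch of single-request inference. This is purely illustrative: it uses a tiny two-layer network in NumPy as a stand-in for a model you might define in TensorFlow or the Cognitive Toolkit, and it does not use any actual Brainwave or Azure API. All names and layer sizes are hypothetical.

```python
import time
import numpy as np

# Hypothetical stand-in for a user-defined model: two dense layers
# with a ReLU in between. On Brainwave-style hardware, a model like
# this would be compiled down to the FPGA rather than run on a CPU.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((256, 128))
W2 = rng.standard_normal((128, 10))

def infer(x):
    """Serve a single request (no batching) through the network."""
    h = np.maximum(x @ W1, 0.0)  # dense layer + ReLU activation
    return h @ W2                # output layer (10 scores)

# Time one request to illustrate per-request latency.
x = rng.standard_normal(256)
start = time.perf_counter()
y = infer(x)
latency_ms = (time.perf_counter() - start) * 1000
print(f"output shape: {y.shape}, latency: {latency_ms:.3f} ms")
```

The point of the sketch is the serving pattern: each request is processed individually as it arrives, which is what keeps worst-case latency low, instead of being queued into a batch the way throughput-oriented accelerators often require.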
It is no surprise that Microsoft intends to make Project Brainwave accessible through its Azure cloud services so that companies can take advantage of live AI. There is no guarantee of wide adoption, but it is apparent that Microsoft does not want to cede ground to companies that tout internet-delivered AI, such as Facebook and Google. It is betting that if companies know they have more control over how their AI runs, they will flock to Azure.