7 May 2018

Build Conference Report – AI and Cognitive Services

Category: Azure

I am at Build 2018. On the stage right now, I am watching a deaf man speak in English with a man who is speaking Chinese. They are using the same app, and that is just cool, I don’t care who you are.

One million developers are using Cognitive Services. Why? Because it isn’t just about speech recognition and customer service bots.

That said, the speech and other cognitive services are really bridging the gap for the deaf. I remember seeing a hackathon project 5 years ago with a Kinect recognizing some sign language gestures. We’ve come a long way using AI.

This is due to Microsoft Cognitive Services and Azure.

Industry Usage

There is a steel manufacturer operating in New Zealand and Australia, Vulcan, that is mounting cameras on the backs of trucks to watch for safety violations using AI at the edge. That is, people aren’t monitoring for a potential safety violation; the AI is watching for it. Once a violation is noted and recorded, Vulcan is then able to train its employees toward safer outcomes.

Cosmos DB collects the data and a model is trained on it. The results of that run are then analyzed. If a dangerous practice really is being used to unload these huge trucks, there can be an educational conversation with the driver. This is an interesting step forward in making sure safety protocols are followed.

What might this mean for OSHA in the future?

Conversational AI

From onstage – “If you are a developer today, you need to learn how to build bots. Business bots are becoming far more than novelty phone-tree helpers.” As evidence of that, Microsoft’s Azure Bot Service has over 300,000 users.

A typical customer service bot shown on screen can reply in whatever language the user writes in. Nice. It can also answer lots of FAQs.
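To make that concrete, here is a rough sketch of how the “reply in the user’s language” piece could be wired up with the Translator Text API. The endpoint, payload shapes, and the key placeholder are my own assumptions, not code from the demo.

```python
import requests

# Hypothetical subscription key; endpoint and header names follow the public
# Translator Text API (v3), not whatever the on-stage bot actually used.
TRANSLATOR_KEY = "<your-translator-key>"
ENDPOINT = "https://api.cognitive.microsofttranslator.com"
HEADERS = {
    "Ocp-Apim-Subscription-Key": TRANSLATOR_KEY,
    "Content-Type": "application/json",
}

def reply_in_users_language(reply_text: str, user_text: str) -> str:
    # Detect the language the user wrote in.
    detected = requests.post(
        f"{ENDPOINT}/detect", params={"api-version": "3.0"},
        headers=HEADERS, json=[{"Text": user_text}],
    ).json()
    user_lang = detected[0]["language"]

    # Translate the bot's canned English answer into that language.
    translated = requests.post(
        f"{ENDPOINT}/translate", params={"api-version": "3.0", "to": user_lang},
        headers=HEADERS, json=[{"Text": reply_text}],
    ).json()
    return translated[0]["translations"][0]["text"]

# e.g. reply_in_users_language("Your order has shipped.", "¿Dónde está mi pedido?")
```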

While this technology may not seem revolutionary to you, perhaps because you’ve been the victim of a bad bot experience, this one is quite different: it can understand and keep context for the conversation. It also goes far beyond hierarchical tree navigation like those phone trees, because it maintains a contextual understanding of the conversation.

Finally, you can add your own custom languages. I give it six months before we have a Klingon bot somewhere.

All Up Cognitive Services

But what about using this stuff in real-world applications for your domain?

Tapping into these services is simple enough that everyday developers like us are using them. Building a bot isn’t that different from building any other app, and the overall workflow isn’t terrifically complex. Try it just for fun.
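To give a sense of how little code “tapping in” takes, here is a rough sketch of calling the Text Analytics sentiment API of that era (v2.0) over REST. The region, key, and exact response shape are assumptions on my part, not something from the session.

```python
import requests

# Hypothetical region and key; the v2.0 endpoint shape matches the Cognitive
# Services REST docs of the time, but treat the details as an assumption.
KEY = "<your-text-analytics-key>"
URL = "https://westus.api.cognitive.microsoft.com/text/analytics/v2.0/sentiment"

def sentiment(text: str) -> float:
    """Return a 0..1 sentiment score (1 = positive) for a single document."""
    body = {"documents": [{"id": "1", "language": "en", "text": text}]}
    resp = requests.post(
        URL,
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json=body,
    )
    resp.raise_for_status()
    return resp.json()["documents"][0]["score"]

# e.g. sentiment("The keynote demos were fantastic") -> a score near 1.0
```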

On stage, the speaker has written out a bunch of prompts in markdown for his bot to learn from. The markdown is easy for a human to read during the training phase, but he noted the services work better with JSON, and there is a command-line tool that performs that translation, which makes the prompts easier for the bot learning service to consume.
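For context, the markdown he’s describing sounds like the .lu format used by the LUDown tool of that era. Here is a hedged sketch of what such a file might look like; the intent names and utterances are made up, and the entity-label syntax is my recollection of the format rather than the speaker’s actual file.

```
# CheckShuttle
- when is the next shuttle to building 92
- i need a ride to {destination=building 33}

# BadgeHelp
- my conference badge will not scan
- how do i reprint my badge
```

The command-line tool then converts this human-friendly markdown into the JSON that the language-understanding service actually trains on, so you author in markdown and ship JSON.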

Check out the Bot Builder v4 SDK on GitHub.
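If you want a feel for the SDK before committing, here is a minimal sketch of an echo bot using the Python flavor of Bot Builder v4 (botbuilder-core) as it later stabilized. The class and method names come from that package, but the preview bits shown at Build may have differed.

```python
# pip install botbuilder-core   (Python flavor of the v4 SDK; an assumption,
# the stage demo itself was not necessarily Python)
from botbuilder.core import ActivityHandler, MessageFactory, TurnContext

class EchoBot(ActivityHandler):
    # Called for every message the channel delivers to the bot.
    async def on_message_activity(self, turn_context: TurnContext):
        user_text = turn_context.activity.text
        # A real bot would hand this off to LUIS / QnA Maker for intent
        # recognition; here we just echo it back.
        await turn_context.send_activity(MessageFactory.text(f"You said: {user_text}"))

    # Greet new members as they join the conversation.
    async def on_members_added_activity(self, members_added, turn_context: TurnContext):
        for member in members_added:
            if member.id != turn_context.activity.recipient.id:
                await turn_context.send_activity("Hello! I am a minimal v4 bot.")
```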

Developer Tools

One can add cognitive services as “services” in Visual Studio that show up in Solution Explorer, and they can be called and coded against right inside Visual Studio. There is also a set of templates under “New Project” that exposes cognitive services as project types. The speaker just created a new project coded in Python, which seems to work quite well in Visual Studio. Pretty cool.

These are first-round technologies being released today.

Conclusion

As the speaker noted, the AI models must be trained, and almost by definition they are only contextually “smart.” We are years from Skynet and from AI models that go beyond specific contextual usage scenarios.

That said, this article provides several jumping-off points for areas of potential interest. Pick one and jump in. Make a sample app. You just might surprise yourself with the coolness factor.
