Bots try to shift web designers into quality assurance
Microsoft has introduced an AI-infused web design tool called Sketch2Code that converts hand-drawn webpage mockups into functional HTML markup. It's not to be confused with a similar Airbnb project that has been referred to, unofficially, as sketch2code.
For years, drag-and-drop web page building apps have been capable of much the same thing, allowing users to move predefined and custom objects onto a digital workspace in order to generate the working web code.
These didn't involve AI. But since last year, when Tony Beltramelli, co-founder and CEO of Uizard Technologies, published a paper describing AI-driven design software called pix2code, interest in applying machine learning to online design appears to have picked up.
Microsoft, keen to coax customers toward its AI-oriented Azure services and to flaunt its data-processing prowess, would have web designers trade keyboard for pencil and let clever code interpret designer intent from doodles instead of relying on some app to slavishly respond to explicit commands.
Front-end web jockeys, freed from the burden of applying their expertise, can look forward to the creative satisfaction of quality assurance, a phrase which here means checking the AI's work.
It might be argued that Sketch2Code lowers the web design bar – not all that high to begin with given the capabilities of existing tools – to admit casual scribblers into the creative crowd while decreasing the time from whiteboard reverie to working prototype.
"Once you have drawn these wireframes on a whiteboard, you can take a picture using the web app and then the web app would send that information to the AI service," said Tara Shankar Jana, senior product manager at Microsoft AI, while evangelizing the tech in a video. "The AI service then runs those images against the prebuilt AI model and then creates an HTML codebase followed by a resulting app."
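The final step of that pipeline – turning recognized UI elements into markup – can be pictured as a simple mapping from detected element labels to HTML snippets. The sketch below is purely illustrative: the labels, the mapping, and the function names are assumptions, not Microsoft's actual model output or code.

```python
# Illustrative sketch of turning vision-model detections into HTML.
# The element labels and the ELEMENT_HTML mapping are assumptions
# made for this example, not Sketch2Code's real output schema.

ELEMENT_HTML = {
    "heading": "<h1>{text}</h1>",
    "paragraph": "<p>{text}</p>",
    "button": "<button>{text}</button>",
    "textbox": '<input type="text" placeholder="{text}">',
}

def render_page(detections):
    """Emit an HTML document from (label, text) detections,
    ordered top to bottom as they appeared in the whiteboard sketch."""
    body = "\n".join(
        ELEMENT_HTML.get(label, "<div>{text}</div>").format(text=text)
        for label, text in detections
    )
    return f"<!DOCTYPE html>\n<html><body>\n{body}\n</body></html>"

# A hypothetical whiteboard drawing: a heading, a textbox and a button.
page = render_page([
    ("heading", "Sign up"),
    ("textbox", "Email address"),
    ("button", "Submit"),
])
print(page)
```

Unrecognized labels fall back to a plain `<div>` here, which is one plausible way such a generator could degrade gracefully when the model spots an element it has no template for.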
The generated code is available as HTML, XAML and UWP.
Microsoft has assembled an arsenal of AI tools to replace human-implemented design. There's a Microsoft Custom Vision Model, trained with images of hand-rendered HTML interface elements. That's tied to a Microsoft Computer Vision Service on Azure, to translate handwritten text into a design element.
The firm's Azure Blob Storage service handles the storage of the generated code files. And an Azure website presents auto-fabbed pages to the world. All of which is coordinated by an Azure Function on the backend.
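The coordinating role of that backend function can be sketched as a short pipeline, with each Azure service replaced by a local stub so the control flow is visible. Every name below is an assumption for illustration, not Microsoft's actual API.

```python
# Hypothetical sketch of the coordinating backend function, with the
# Custom Vision model, HTML generation and Blob Storage stubbed out
# as plain Python. None of these names are Microsoft's real APIs.

def detect_elements(image_bytes):
    """Stub for the Custom Vision call: returns (label, text) pairs."""
    return [("heading", "Welcome"), ("button", "Sign in")]

TAGS = {"heading": "h1", "button": "button"}

def generate_html(elements):
    """Stub for the markup generator."""
    body = "".join(f"<{TAGS[l]}>{t}</{TAGS[l]}>" for l, t in elements)
    return f"<html><body>{body}</body></html>"

blob_storage = {}  # Stand-in for the Azure Blob Storage container.

def handle_upload(image_bytes, page_name):
    """The coordinator: detect elements, generate markup,
    store the file, and hand back a URL for the hosted page."""
    blob_storage[page_name] = generate_html(detect_elements(image_bytes))
    return f"https://example.invalid/pages/{page_name}"

url = handle_upload(b"fake-whiteboard-photo", "demo.html")
print(url)
print(blob_storage["demo.html"])
```

In the real service each stub would be a network call – the vision model and OCR on Azure, the file written to Blob Storage, the page served from an Azure website – with the Azure Function gluing them together much as `handle_upload` does here.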
Assuming the optical character recognition code works well, the website sketch only incorporates UI elements described in the vision model, and Microsoft's code runs without issue, Sketch2Code could prove helpful. At the very least, it will provide a way to check whether the Azure billing cycle is reliable.