What is Arter?
Arter is an AI-powered storyboarding tool. Its AI agent, Art, listens to a user's input and, through natural language processing (NLP), parses each word into an image object using image-association analysis.
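The word-to-image-object step described above could be sketched roughly as follows. This is a minimal illustration with a toy association table; the function, dictionary, and asset names are hypothetical, since Arter's actual model and asset library are not specified here.

```python
# Toy association table: word -> sketch asset (hypothetical asset names).
IMAGE_ASSOCIATIONS = {
    "dog": "sketch_dog.svg",
    "park": "sketch_park.svg",
    "ball": "sketch_ball.svg",
}

def parse_to_image_objects(utterance: str) -> list[dict]:
    """Tokenize the input and map each recognized word to an image object."""
    objects = []
    for word in utterance.lower().split():
        word = word.strip(".,!?")          # drop trailing punctuation
        if word in IMAGE_ASSOCIATIONS:
            objects.append({"word": word, "asset": IMAGE_ASSOCIATIONS[word]})
    return objects

print(parse_to_image_objects("A dog chases a ball in the park."))
# → [{'word': 'dog', 'asset': 'sketch_dog.svg'},
#    {'word': 'ball', 'asset': 'sketch_ball.svg'},
#    {'word': 'park', 'asset': 'sketch_park.svg'}]
```

In a real system the lookup table would be replaced by a learned association model, but the input/output shape (words in, drawable objects out) stays the same.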
For humans, the need to communicate is paramount. Our primary tool is language, most often written or spoken. But, more often than not, much is lost in translation. We suppose that's why they say a picture is worth a thousand words.
High-Fidelity Mockups
The goal for this class was to gain a deeper understanding of machine learning by exploring its associated technologies, and to let our discoveries and insights drive our conceptualization and area of focus.
We gained more in-depth theoretical knowledge about how machines learn through various papers and articles. Two publications stood out in particular: the "AI-EDAM" paper published by Cambridge, and "Designing Agentive Technology" by Christopher Noessel, which helped us better understand how to design for machine learning and the role artificial intelligence plays in our lives.
Another big initial driver for us was taking inspiration from Code2Pix, a deep-learning compiler for generating graphical user interfaces (GUIs) from images.
How Does Our System Work?
The system consists of three components: the app client, the machine learning model, and the database. The client side is imagined as a tablet/web app that accepts text, voice, and pen input. The ML model gathers this data for pre-processing; the data is then run through different models trained on storyboarding assets such as images, scripts, and sketches, and finally deployed and tested on real data. For the database, we plan on using the Sketch-RNN dataset, the COCO dataset, and voice recognition models.
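The three-component flow above (client input, pre-processing, model inference) can be sketched as a simple pipeline. Everything here is illustrative: the function names, input keys, and the stub model are assumptions standing in for the real client and trained models.

```python
# Hedged sketch of the client -> pre-processing -> model flow.
# Stage names and the stub model are illustrative, not Arter's actual API.

def preprocess(raw_input: dict) -> str:
    """Normalize client input (text, voice transcript, or pen strokes)
    into a single text prompt for the model."""
    return raw_input.get("text") or raw_input.get("voice_transcript") or ""

def run_model(prompt: str, datasets=("Sketch-RNN", "COCO")) -> dict:
    """Stand-in for models trained on storyboarding assets;
    returns a result record instead of real sketches."""
    return {"prompt": prompt, "trained_on": list(datasets), "sketches": []}

def storyboard_pipeline(raw_input: dict) -> dict:
    prompt = preprocess(raw_input)   # client input -> pre-processing
    return run_model(prompt)         # processed data -> model inference

result = storyboard_pipeline({"text": "A dog chases a ball"})
print(result["prompt"], result["trained_on"])
```

The point of the sketch is the data shape at each boundary: the client hands off heterogeneous input, pre-processing collapses it to one representation, and the model stage is the only part that touches the training datasets.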
With the help of machine learning, it is possible to offload the pain point of initial design brainstorming.
Machine Learning can help drive inclusive designs and eliminate edge cases.
Designers can make better design decisions if they understand how the technology that powers their projects works.
Machine learning shouldn't replace the role of the designer. It should elevate their design decisions and assist them.
Continue to refine the user interactions and the UX/UI flow.
Cover more edge cases and build out a few more features, such as the object context menu or the brush tool.
Build a prototype and train a model on our own dataset.
If you want to learn more about the project or its current state, visit the website linked above.