SQUIRE, as Apple’s AI researchers call their software, stands for “Slot QUEry Intermediate REpresentations”. It supports the AI-assisted development of user-interface prototypes, for example for apps and websites. UI designers start with a natural-language request to the AI (a prompt). An assistant then asks for a code sample so that the structure of the data to be displayed can be taken into account in the design. In a series of interactive questions, the users determine the basic layout; alternatively, they can click “Auto” and leave the arrangement wholly or partly to the LLM.
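The idea of deriving UI structure from a data sample can be sketched roughly as follows. This is an illustrative toy, not Apple’s actual SQUIRE representation: the field names, the `infer_slots` function, and the mapping rules are all assumptions made here for demonstration.

```python
import json

# Hypothetical sample data a designer might paste in for the
# film-research example; structure and field names are invented here.
SAMPLE = json.loads("""
{
  "title": "The Third Man",
  "year": 1949,
  "poster": "https://example.com/third-man.jpg",
  "favorite": false
}
""")

def infer_slots(record: dict) -> list[dict]:
    """Map each field of the sample record to a plausible UI slot."""
    slots = []
    for key, value in record.items():
        if isinstance(value, bool):           # check bool before int/float
            kind = "toggle"
        elif isinstance(value, (int, float)):
            kind = "number_label"
        elif isinstance(value, str) and value.startswith("http"):
            kind = "image"
        else:
            kind = "text_label"
        slots.append({"slot": key, "kind": kind, "example": value})
    return slots

for slot in infer_slots(SAMPLE):
    print(slot)
```

In this sketch, the inferred slot list would be the starting point for the interactive layout questions; SQUIRE’s real intermediate representation is described in the published paper.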
Live preview and organizational chart
A live preview of the result, rendered from the attached sample data, appears on the right side of the screen. On the left, the individual components (text fields, images, buttons, controls, etc.) are collected in a nested view, which lets users target their prompt precisely at a single element. Adding further (interactive) objects is just as easy; the LLM (in this case GPT-4o from OpenAI) even suggests new elements that fit the context.
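A nested component view of this kind can be modeled as a small tree in which each node is individually addressable, so a prompt applies to exactly one element. The following is a minimal sketch under assumed names (`Component`, slash-separated paths), not SQUIRE’s real data model:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Component:
    """One node in the nested component view (kind, name, children)."""
    kind: str                                   # e.g. "card", "image", "button"
    name: str
    children: list["Component"] = field(default_factory=list)

    def find(self, path: str) -> Optional["Component"]:
        """Resolve a slash-separated path like 'card/poster' to a node."""
        head, _, rest = path.partition("/")
        if head != self.name:
            return None
        if not rest:
            return self
        for child in self.children:
            hit = child.find(rest)
            if hit is not None:
                return hit
        return None

# A tiny tree matching the film-card example above.
tree = Component("card", "card", [
    Component("image", "poster"),
    Component("button", "play"),
])

target = tree.find("card/poster")
print(target.kind)  # the user's prompt would be applied to this node only
```

Addressing a single node this way is what makes the “tailor the prompt to one element” interaction possible; everything outside the targeted subtree stays untouched.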
Study with 11 designers
To test its practical suitability, the developers had eleven web designers implement two projects with SQUIRE: a website for researching films and a presentation of hiking trails. All interactions were recorded, and the participants were asked for their assessment at the end. Overall, the majority of participants were impressed with the software, though some were not convinced by the abstract organizational-chart view.
Implementation for SwiftUI?
In the long run, this is unlikely to remain pure basic research: Apple could conceivably include such a tool in its own developer tools, possibly as a kind of “SwiftUI Playground”. In any case, the paper now published on the company’s machine-learning blog shows that Apple continues to work intently on turning machine-learning and AI capabilities into useful interaction concepts.

