Musings on designing experiences & (re)engineering complexity
Stumbled across a discussion about user experience (UX) and a new tool from Microsoft called Sketch2Code the other week, and had this to offer:
It seems w/AI & ML we need to stop thinking about computers as “process/calc.” Rather, how they enable us to be better, while doing computing better. We sketch, they code. Can that turn into a positive for all? That’s the kind of question I think UX should be trying to answer
This kind of response asks a certain accountability of the UX community. It offers a challenge to think about user experience away from, and outside of, the (necessary) shape of the space. Yes, the well-worn stereotypical conversation about UX devolves into Photoshop/Sketch versus everything else, the affordances of (and battles with) Agile development, and whatever new tool or method can be used to convince stakeholders to “care before doing.” None of these are wrong in totality, but they miss one of the main points of UX: improving the reliability of communication between the user and the producer.
Sketch2Code is the kind of application that seems to share in the stereotypical conversations UX and UI development folks get into. It is an open-source implementation built on Microsoft’s artificial intelligence (AI) engine. It is rough around the edges, yet polished just enough for senior developers, UXers, and engineers to pick up what’s being put down — can a tool augment an aspect of the creative process where the gap between idea and execution often arises (but is not seen until later)? As mentioned in another tweet, sketching is declarative; it is a different mental model than “code and show me.”
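The declarative framing is easier to see in code. Below is a minimal, hypothetical sketch of what a sketch-to-code pipeline reduces to once the vision model has done its work: detected elements in, markup out. None of these names or structures come from Microsoft’s actual API; they are illustrative only.

```python
# Hypothetical reduction of a Sketch2Code-style pipeline: a vision model
# detects UI elements in a hand-drawn wireframe, and a simple declarative
# mapping turns those detections into HTML. Names here are made up.

from dataclasses import dataclass


@dataclass
class DetectedElement:
    kind: str   # e.g. "heading", "button", "textbox"
    label: str  # text recognized inside the sketched element


def elements_to_html(elements):
    """Declaratively map detected sketch elements to markup."""
    tag_map = {"heading": "h1", "button": "button", "textbox": "input"}
    lines = []
    for el in elements:
        tag = tag_map.get(el.kind, "div")
        if tag == "input":
            lines.append(f'<input placeholder="{el.label}">')
        else:
            lines.append(f"<{tag}>{el.label}</{tag}>")
    return "\n".join(lines)


# A whiteboard sketch, reduced to what a vision model might detect:
sketch = [
    DetectedElement("heading", "Sign up"),
    DetectedElement("textbox", "Email"),
    DetectedElement("button", "Submit"),
]
print(elements_to_html(sketch))
```

The point of the toy example: the sketcher declares *what* should exist, and the mapping decides *how* it renders — which is exactly the mental-model shift away from “code and show me.”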
Being able to sketch, even badly, pushes conversations forward. It doesn’t (and shouldn’t) answer all the permutations of consequences and affordances of users’ actions. But that’s not what users (and eventually stakeholders) ask. They are literally asking UX: “can you show me my normal, then push forward or backwards in areas that optimize what it is we (think we) see?” The UX conversation doesn’t so much address that question as use it as a means to launch into tools and methods. Sketch2Code turns that on its head, then asks whether UX has the skills necessary to continue the conversation while enabling what’s being built (quickly).
A concept previously shared seems like the logical evolution of Sketch2Code. Because once you can get an idea on a whiteboard, and use the non-seeing abilities of a camera to do something with that board, you only need to interact (test, tweak, and push to a public state). Yet the ideation of this hits a stopping point with UX because of the questions this field will ask (what are the transitions, what style library will be used, Sketch or Framer, etc.). It will get built when a developer/engineer tosses those questions aside and just starts building, or when Sketch2Code takes its obvious next step: sketch and interpret.
For the UX space, “what happens when your tools become smarter” is the better conversation. The questions that drives will always add up to a better user experience for all.