Transcript
Doyle: We’re going to do a Q&A panel. We’re going to dig in a little bit more on how augmented, virtual, mixed, and extended reality unlock the ability to integrate the power of computers more seamlessly into our physical, three-dimensional world. Designing the user experience of these next-generation UIs to be as inclusive as possible comes with a lot of challenges. We’re going to sit down and talk about some of the insights and strategies for creating more inclusive and accessible XR environments.
Our speakers today, we have Colby Morgan, we’ve got Dylan Fox, and Erin Pañgilinan.
How User Testing Differs with Accessibility as Priority, and Best Practices
How does the process of user testing differ when accessibility is a priority, and what are some best practices?
Erin Pañgilinan: My friend Suzanne Leibrick, with whom I co-founded my previous non-profit for women working in AR/VR, called ARVR Academy, and who was actually in Mixed Reality at Intel at the time, talks about the concept of gameplay testing. If you really think about it, even in AI development, you’re not going to start with the most gigantic foundation model and the largest amount of data. You’re going to start off with something really small and simple and test it on 10 people. What I’ve always encouraged people to do since 2016 is to try as many different VR, AR, and XR demos as possible. When you’re actually developing a single experience, you’re going to start with maybe just a few interactions at a single level and see if it works with actual people.
If more than 60% of your testers respond well, you’re probably in a good range. A lot of the time you’ll try something on a bunch of people and realize, this actually doesn’t work. That way you’re saving a lot of time when you’re gameplay testing. That’s the first thing I would say. It’s pretty easy. It’s just that most people who haven’t developed for games before, and the same goes for people coming from AI, haven’t done it. You’re always going to rapidly prototype the smallest MVP possible.
Dylan Fox: I think for me, one of the challenges we see with XR, as opposed to mobile and desktop technology, is that those platforms generally assume that most people have either a desktop or a mobile phone at home, and that they are able to turn it on, operate it, and use the application, often remotely, without necessarily needing any external help. If you want to have inclusive XR experiences, the unfortunate reality right now is that there are a lot of people who would be interested in using them, but may not have a headset at home, and even if they did, may not be able to take it out, get through the setup process, put it on themselves, and activate your application independently.
Something that I noticed when I was running the experiments we did at UC Berkeley on using the HoloLens for low vision folks, is that these systems were not necessarily created with the idea that the person who is in the experience may not be the one who is operating the experience. I would love to see more support for experimenters, more support for caretakers. There’s a lot of times, especially with new technology like XR, where you want to empower the person who is wearing the headset to just focus on the experience, and you want to offload some of that setup and execution to somebody who is helping them on the side. Making sure that you have those good tools, so that if I’m trying to run an experiment or user test for somebody who is totally blind, I can hand them the headset, they can put it on, and then I can get them into the application, get them set up, and monitor their progress. That’s something that’s not super well supported right now. It would be lovely to see better support for it.
Colby Morgan: Especially when it comes to accessibility, I think it’s about getting it early in your process, so as you’re developing those features, it’s part of your design process from the start as well. That way you have opportunities to solve issues before they become issues. I think a lot of it is just that early mindset, too. Accessibility and inclusive design is really about good design in general, so it’s really about trying to make good products and good experiences that have a lot of that baked in.
Again, you’re reducing the amount that you have to do later on down the pipeline. User testing is obviously a really critical part of that, so as much as possible, try to work with your users or the communities, the people who can really give you solid feedback about the accessibility tools you may be working on or integrating. The big thing I always think about, too, is that it’s really easy to get a feature in that checks the box and does the bare minimum, but a lot of times, finding what gives users value is really critical at that step, especially with XR. Again, it’s such an immersive technology, so you want to make sure that the tools you’re putting in actually provide value for users.
Erin Pañgilinan: My friend Nathan Ventura co-founded a company called Vinci Games, so he’s YC backed. He’s doing VR basketball. When I first tried his game, Blacktop Hoops, it was literally just, here’s the environment, shoot a basketball. He didn’t have characters yet, which are now tied to a lot of IP and film. I tried it at a demo at the Philippine Consulate, since he’s Filipino and I’m Filipino. One of our other friends is number two in the world at combat design for Street Fighter, and he’s been a combat designer on most games; he’s probably played everything on every console, X-Men included. I asked him, how was it? Did you break presence when you tried this? He said he didn’t, but when I actually tried it a second time, I broke presence, which is something really common in VR: I feel like I’m really in this environment, but it skips too much, or it has the buggiest things. You’ll do a ton of work, spend maybe hundreds of hours developing a game, and find out in practice that it doesn’t work.
Another one I’ll mention is a friend, Jazmin Cano, who actually has a chapter in my book. She’s a technical 3D artist and an accessibility product manager. Her game was from Owlchemy Labs, which was acquired by Google. If you’ve ever played Vacation Simulator or Job Simulator, this was their latest title. We tried it at Meta Connect, and the most basic thing, the Wi-Fi, didn’t work. When we think about design, obviously Dylan’s work has focused a lot on differently-abled or disabled folks, and you’re trying to expand the number of diverse content creators, trying to be as inclusive as possible with your design. But when it comes right down to it, it’s not just good design, it’s basic engineering: the Wi-Fi wasn’t working at Meta Connect, so when I was playing in my headset, I heard the voices of other headsets being mixed together in a multiplayer environment with fewer than 10 people. I’m like, that’s pretty bad. That’s not the fault of Owlchemy. It might have been the people organizing the event at the conference.
I like to think of gameplay testing in every type of environment, no matter how big or small. One last thing I’ll say, too. With Horizon, if you think about watching Meta Connect in VR during the pandemic, you could probably have hundreds of voices, but once you scaled to thousands of people attending, you couldn’t really do that. Same idea. At the end of the day, it’s also just basic engineering when you think about latency and functionality. Not just good design, but basic practices: does this work or not?
Inclusivity and Accessibility, From Day One, in the UI Design Process
Doyle: How do we ensure that inclusivity and accessibility are integral parts of the UI design process from day one rather than add it on later when we maybe find it in testing?
Colby Morgan: I think part of it touches back to the culture of a team, making sure the team behind the designs, the UI, the concept, and the development have those concepts as part of their workflows, and that everyone’s on the same page as far as what you’re working towards. The culture is part of that. For me, XR is such an immersive technology, and social presence and connection are such a huge part of it, that you want to make it as inclusive as possible and as easy as possible to get into. I think for a lot of people, XR is just a lot. It can be a cognitive overload trying to get into an experience.
Obviously, it takes over your vision and all that. I think just, again, leaning back to just good XR design to make things simple and easy to operate and easy to get people in regardless of their experience level, because I think with every new technology there’s always going to be people that are new to it, and there’s always going to be that struggle. Again, just relying on those designs that make it really easy to get people in and participating.
Dylan Fox: I think the single biggest thing in my mind to make sure that you have accessibility on the radar from the start is simply to have folks with disabilities on your team. If you’re part of an organization that has a lot of folks with disabilities, then it’s naturally going to be a part of the conversation, because you aren’t going to ship something that doesn’t work for your own people. Failing that, making sure you have people with disabilities in the loop as early and as frequently as possible way before you get to the alpha, like part of the design process, is just going to be really important.
I think another thing we often talk about, when it comes to the cold, hard cash of development, is that it is far cheaper to build something in than to retrofit it. If you get to launch or to beta testing and then find out, actually, it’s not accessible, we need a screen reader, or we need to be able to scale our UI, that is way easier to do when you’re first building it out than after you have a million interconnected systems that it breaks. If you think accessibility is ever going to be an issue for your game, your app, your organization, which it probably will be if you get successful enough, then it’s better to think about it from the start, save yourself a whole lot of time, headache, and money, and have a better product right from the get-go.
Erin Pañgilinan: I’m with Dylan on all of those points when you think about the end user, and about the different senses we’re targeting. We saw a talk with Google AR about how visually impaired people see and don’t see. If you can’t speak, that’s completely different. If you’re someone who doesn’t have an arm, you’re not going to have a hand controller, or a hand. These are really difficult challenges. Something really basic, which I opened my talk with, is avatars. I was actually talking with Jazmin about the entire flow of the app. She’s like, what? You skipped over my entire part at the very beginning: create an avatar. I started off bald; it looked like a Nigerian man.
I remember in the very early prototypes for Oculus, by default I was a very tall golden man. There were only metallic colors. I was like, I don’t really identify with this, but it’s really cool. I love Jeremy Bailenson’s work, but it is a little problematic to say, I can achieve empathy because now I can put blackface on. I’m like, that’s not what this is for. To be culturally relevant, I want an avatar or an agent that looks like me. In one of the other user tests I’ve been doing at Meta, I can’t say too much about it, but they ask you about skin color. I was like, that’s really new and interesting. They’ve never asked that before. I think at least some of the market leaders in headsets and manufacturing are trying to think about how to be inclusive of more people as end users, not just the people creating the products. Five years ago, that was never thought of; it was a throwaway and a nice-to-have. Now we’re actually thinking about it first. That wasn’t the case before.
Also, in the Owlchemy Labs game we played at Meta Connect, I was talking to her about how teleporting is really different now. Previously you had to have a controller, whether you were using an HTC VIVE or anything in the Oculus and Meta ecosystem; now it was all hands. It made me think, I don’t naturally teleport by putting my hand out to go to the next portal. Is that the most inclusive UI if you don’t have a hand? It really depends on who you’re targeting. I think you need to think about, who is that type of user? What senses do they have and prefer? I wouldn’t say there’s flat-out bad design. There are just different points of accessibility that people identify with.
The last thing I’ll mention, from the AI community, is for black women or brown folks like myself. I’m a light-skinned, privileged Filipino woman. I will say, black folks don’t trust AI. Straight up, why? Because of the algorithms: look at basic image recognition on skin. It’s why when I go to the bathroom, the automatic hand dryer or faucet will work for me, but previously wouldn’t for darker skin. This is because accessibility is not thought of. It’s the last thing on the list for AI developers. For XR it was a little bit better, but it was still an afterthought at first. We weren’t designing inclusively by default. I think that’s now changing. It’s going to take a lot more work, I think, to be much more inclusive. Not just the content and the people developing it, but thinking at a base level: what is the data and input that a machine is taking in? What’s the sensory input that I’m receiving? You’re thinking about I/O, the input and output on each end, what you’re inputting and receiving. I don’t think that was thought of very critically before. It’s definitely transformed and changed over a number of years.
Team Culture and Accessible UIs
Doyle: What role does team culture play in building these accessible UIs? How do we foster a culture that values feedback and diverse perspectives?
Colby Morgan: I think part of that is really focusing on getting a diverse team, so you have a diverse set of backgrounds and experiences to pull from. As Dylan was pointing out, if you have a lot of different perspectives on your team, you can identify a lot of those things. Even as a culture element, it becomes easier to keep all of those different aspects top of mind if you just have a diverse team. So I’d say: diverse teams.
Dylan Fox: I think in addition to the team, one thing you can do is spotlight your users and other stakeholders that are of different backgrounds, different abilities. We had a project with XR Access called the Stories Project, where we interviewed a number of disabled folks that had used XR, had an interest in it, enjoyed it, but obviously ran into challenges in using it.
One thing we wanted to do was just spotlight those experiences, because I think it’s very easy for people to assume that everyone who uses a certain app is like themselves. We don’t tend to think about people who aren’t like ourselves unless we see some evidence that they exist. If you can spotlight the people who exist outside the norm, you help to normalize that this is for everybody, not just the standard, which in tech is usually the standard cishet white dude. That can also be really helpful in making sure your team doesn’t just see diversity of opinion within the team, but also in the people you’re creating for.
Erin Pañgilinan: Outside of the obvious ones like race, gender, class, ethnicity, and language, region is really important. This applies to any application interface, to me even in web development, not just XR. How many different types of phones are there in the Philippines, for example? How many different screen sizes do I need to design for? I’ll never be an Android developer; that’s too many. I was hardcore iOS for a really long time for a lot of different reasons. When you’re thinking about accessibility targeting, by default it’s about population and region.
Outside of that, I also consider the types of disciplines on the team. For a really long time in game design, only people who were AAA developers could be in XR. That was highly biased, especially against third-party independent developers like myself. I did not have that experience; I had a previous life in another career. There’s a lot of bias there. Beyond non-traditional backgrounds, what I would say is that, to me, it wasn’t actually engineers who made the best experiences in VR in particular, it was architects: people who actually have an understanding of 3D space.
When you think about AI, it’s not necessarily machine learning engineers with a statistics PhD and background. That used to be the case. I actually think psychologists have a lot to say. Cognitive scientists, anyone who actually works in real science, in the brain, in the life sciences, there’s a lot to be worked on there if you’re trying to achieve, to me, AGI or consciousness, when you think about presence in VR. I like to think beyond the explicit, insert-your-identity-here kind of diversity. Diversity in the discipline you focus on is also really valuable. That’s now changing.
The barrier to entry is also dropping. The classes I taught from 2016 on ranged from K-12 students using Google Cardboard to employees at Cisco developing early prototypes for enterprise. We weren’t getting people who were core C# developers. They were just engineers trying to think about networking, learning from first-person shooter games but applying it to game design that’s essentially B2C in a B2B environment. There are a lot of things that are cross-disciplinary by default that can make things more inclusive, and new ways you can put things together intersectionally that will create new and exciting experiences that are actually more inclusive. Not just, who’s creating it because you’re a cis straight white man and you’re overrepresented.
Instead, I think about Dylan’s expertise as focusing on populations you typically wouldn’t think of as playing in an experience that’s the most cutting-edge and new, which by default makes it much more human and accessible. Whether it’s speech, voice, and NLP, or sensory, touch, and gesture, get to the lowest common denominator of who cannot access those things. By targeting those audiences, I’m not saying, it’s a queer, trans, black woman who is disabled, so that’s who you should target. I’m saying, if you actually support and target those things and put those people on your teams, they’re probably going to produce experiences that are ten times richer, not because of who they are, but because of the disciplines they have to think about and how they design and interact with the experience.
Then on top of that, they’re going to layer, here’s my knowledge and statistics about why this doesn’t work for this population, why that’s not culturally relevant for my community, and why it does or does not work. It’s actually multilayered. It isn’t just, I’m going to hire a woman, and because of that it’s automatically going to be more diverse and automatically a great experience. No, you can’t take it at face value. It’s much deeper and more multilayered than that when we think about the word diversity. Diversity, not just a token Asian face or a token affirmative action card. It’s really about, what are you bringing to the table that is different and new, what different paradigm are you contributing to innovation?
Colby Morgan: One thing I’ll add to that: beyond getting a diverse team, make sure there are opportunities and everyone feels empowered to have a voice on your team, too. Especially with new people on our team, we have to go through a phase of getting them really comfortable with giving feedback and feeling empowered to give it. A lot of projects, especially early on, can fall really easily into a design hole or a code silo where you’re so focused that you almost don’t want feedback. You’re like, I’m really focused on this. I don’t want to get that feedback.
A big part of it is having a culture that’s open to feedback, where everyone’s empowered to give feedback at each of those steps, because there are a lot of friction points that people run into where they just don’t say anything. Especially when you’re working on XR experiences or features, it’s really easy to gloss over them: that was hard, but I’m not going to really pay attention to it. Make sure everyone can call those things out, so that it becomes, yes, everyone was having a really hard time with this. We should actually try to make it better.
How to Design Adaptive UIs
Doyle: Then, how do you design UIs that adapt to individual users’ needs and preferences? What are the current limitations you’re seeing and where do you see this heading?
Dylan Fox: I think the number one rule in my mind, which I discussed earlier, is that principle of modularity. You should have interfaces where the different inputs and the different outputs can all be done in multiple different ways. For a really great example of this, you can look at “The Last of Us Part II”, which had an incredible suite of accessibility functions where people could say, en masse, give me the hearing suite, give me the vision suite, but then tweak individual things. Take any key item, let’s say there’s a scrap of material on the ground you can scavenge and use in crafting. If you turn the accessibility feature on, it wouldn’t just light up, it would also make a sound, or the controller would vibrate. There are any number of ways you could get that multi-modal feedback. Make sure your systems are modular like that, and think about it even for things you might not necessarily consider as needing it.
Things like eye contact in social VR: if I can set it so that I can literally feel everybody’s eyes on me if they’re all watching me, that could be a really powerful thing. This is also an area where I’m really excited for AI, because the ability for people to just say to the machine, make this thing bigger, or, can you outline X for me, and adjust their experience on the fly, would be incredibly powerful. That’s something we’re just barely starting to see, but it will be very interesting to see whether we can have systems that can accommodate those types of requests in the near future.
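Dylan’s modularity principle can be sketched in a few lines of code. This is a minimal, hypothetical sketch, not taken from any real engine or SDK; the class, function, and channel names are illustrative. The point is that one game event fans out to whichever feedback channels the user has enabled, so adding a new modality touches one branch rather than the whole event system.

```python
from dataclasses import dataclass

@dataclass
class AccessibilitySettings:
    # Hypothetical per-user toggles; names are illustrative only.
    highlight: bool = True    # visual outline on interactable items
    audio_cue: bool = False   # spatialized chime
    haptic_cue: bool = False  # controller vibration

def announce_item(item: str, settings: AccessibilitySettings) -> list[str]:
    """Fan one game event out to every feedback channel the user enabled."""
    channels = []
    if settings.highlight:
        channels.append(f"outline:{item}")
    if settings.audio_cue:
        channels.append(f"chime:{item}")
    if settings.haptic_cue:
        channels.append(f"rumble:{item}")
    return channels
```

Because each output channel is independent, supporting something like the eye-contact haptics Dylan mentions would mean adding one more toggle and one more branch, not rewriting the interaction code.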
Erin Pañgilinan: Something really basic I mentioned earlier is that your headset of choice is maybe informed by your “ethical moral values”: I blame company X about privacy, so I’m not going to use Meta. That’s something I hear a lot. It’s the same thing for Google. It’s really tough, because most of the companies developing these things do have legitimate privacy concerns with AI. Here’s something really basic that they are doing right, which I think is great. I don’t know if it’s Scarlett Johansson, but I actually had my mom pick: which ChatGPT voice do you want? It’s like customizing your version of Siri. I thought, that’s a cool user choice. We didn’t have that before. Being able to customize to the user based on a set of preferences matters; before, I didn’t even have the option of an avatar that wasn’t a man by default.
If you think about cars, my brother’s a big AZN racer, so we think about customizing and modifying the car constantly. Since the ’90s, it’s been a big thing; for Asian people, this is the rice rocket movement. How would I customize my computer the same way? Like CarPlay: take out the screen, put in your own screen, modify the UI. This is my custom window, and now it’s going to match the color of my car. Think about it: what if you could customize the headset so that it would be much more accessible?
Here’s one thing, when I think about more than just ergonomics. I love the Apple Vision Pro, but it isn’t great for accessibility for people with long hair. It’s terrible. No one’s gotten makeup right. Jazmin and I actually talk about this a lot: what eyeliner can you wear? Something really basic. People don’t think about these things, definitely because there weren’t that many women on these teams. Think about even the car: how do we design a seatbelt? That was designed, in terms of safety, primarily for cis straight white men over 6 feet tall. I’m 4’10”. When we think about accessibility, it’s not just, what is my programming language of choice as a developer, or what is my headset of choice? I was really excited about Apple back when you could customize the different colors of your computer; this was in the late ’90s. There’s so much more choice that we have now. There are still a lot of things we don’t have choice in.
Part of the reason I’m so excited for Meta, and they haven’t done this yet, is that they haven’t open sourced their SDKs and APIs for anything with AR glasses. Many people are trying to hack them. It kind of works, and it kind of doesn’t. I did talk with the director of product at Llama, because they had shown some videos at Meta Connect about how they’re using Llama 3, their latest model, with VR. Is it really accessible if you’re talking about multi-modal models but not open sourcing really basic stuff for people who aren’t doing model development, just regular software engineers? I think part of that problem is just that there’s so much work to do within the ecosystem first before you open source something to that level of standard.
Optionality, and picking different types of models, even different sizes, makes things more affordable. It’s the same thing for headsets. Most of my community cannot afford a $3,500 Apple Vision Pro. If they could just get something as cheap as a Meta Quest S, probably around a $500 price point, that would open up the doors not only for more consumers to enter but also for designers and developers from the independent, third-party community who don’t work at a big AAA game company or any of the headset manufacturers.
Colby Morgan: I think about adapting features and experiences for specific users. Obviously, there are a lot of different things you could do and handle. One of the powerful things I really like to rely on as much as possible is finding tools and systems that can adapt to users: broad tools and systems that have a level of dynamic adaptation to a player, to more seamlessly meet some of their needs and open up the experience. I want to create a more seamless experience that takes more of that friction out. So, one, identify the tools and systems that give a lot of value to a lot of different users. Then, make those as dynamic as possible.
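One common pattern behind what Colby and Dylan describe, bundle-level presets (a vision suite, a hearing suite) that users can then tweak individually, can be sketched as layered dictionaries. All option names, preset names, and defaults here are hypothetical, chosen for illustration rather than taken from any shipped title.

```python
# Hypothetical accessibility options: defaults < preset bundle < user overrides.
DEFAULTS = {"ui_scale": 1.0, "high_contrast": False, "audio_cues": False,
            "subtitles": False, "visual_alerts": False}

PRESETS = {
    "vision":  {"ui_scale": 1.5, "high_contrast": True, "audio_cues": True},
    "hearing": {"subtitles": True, "visual_alerts": True},
}

def build_settings(preset=None, **overrides):
    """Apply a preset bundle en masse, then let the user tweak individual options."""
    settings = dict(DEFAULTS)
    if preset is not None:
        settings.update(PRESETS[preset])
    settings.update(overrides)
    return settings
```

For example, `build_settings("vision", ui_scale=2.0)` gives a user the whole vision bundle while still honoring their personal UI-scale preference, which is exactly the preset-plus-tweak flow described above.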
Dylan Fox: The more you can embrace users creating their own adjustments, things like mods, 3D printing, custom controllers, or custom headbands, the better. There’s a lot you can do as a developer to support that.
Balancing Simplicity and Feature Depth in UI (VR and AR)
Doyle: What about balancing simplicity and feature depth in the user interfaces, especially in the more immersive environments like VR and AR?
Colby Morgan: Yes, especially in XR, it’s definitely a balancing act, because it’s really easy to go ham. As an XR developer, it’s really exciting to make things, to add a bunch of features and effects and different things. Just as with good product design, I always try to think about how you focus your product and your designs, and really think about your user’s focus. What do you really want them to see? What do you want them to look at? What are they going to focus on? It’s a balance back and forth: how do you keep a really focused experience for the user while adding more engaging features for users who have been using the product longer? That’s the balance of keeping it simple early on, but having a clear path for a user to get exposed to that depth and complexity.
Also, any XR application should let users choose to keep to the simple core experience. They don’t necessarily want to go too deep into the features or anything like that. They should have the opportunity to participate at the level of the experience they feel comfortable with.
Dylan Fox: A good example to look at is Google’s Tilt Brush. You can start off in the simple mode, where there are only a few tools on each of your little tool palettes. Then, when you’re ready, open up the complex mode and get a bunch more controls, a bunch more customization.
Think about things like that. When it comes to design for neurodiverse folks, have the option to turn off distractions, to really bring things down to the barest elements. You’ll see that as well with some low vision support: again, The Last of Us has a high contrast mode that strips the scene down so that the player is in blue, the enemies are in red, new items are in yellow, and everything else is in gray. You get an extremely different aesthetic style, very minimalist. It makes it so that even if you’re heavily visually impaired, you can still get that core sense of what’s happening. Thinking of ways to let people scale their experience up and down like that is really good design philosophy.
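The high contrast mode Dylan describes boils down to tagging entities with a semantic category and swapping the renderer’s palette. A toy sketch of that idea, with category and color names chosen to mirror the description rather than any engine’s actual API:

```python
# High-contrast palette keyed by semantic category; everything
# uncategorized drops to gray, as in the mode described above.
HIGH_CONTRAST = {"player": "blue", "enemy": "red", "new_item": "yellow"}

def entity_color(category: str, high_contrast_on: bool) -> str:
    """Pick a render color: normal path, or the minimal high-contrast palette."""
    if not high_contrast_on:
        return "full_color"   # normal rendering path, untouched
    return HIGH_CONTRAST.get(category, "gray")
```

The design choice worth noting is that gameplay code only ever reports *what* an entity is; the accessibility layer alone decides *how* it looks, so the mode can be toggled without touching gameplay logic.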
Erin Pañgilinan: Something I started with when I was transitioning from frontend and mobile into data science, probably 10 years ago, was reading a book on microinteractions. I actually like to borrow a lot from really basic web development. I was at AltConf, the side conference that a bunch of the mobile iOS developer community holds every year to watch WWDC, which is Apple’s developer conference.
My friend Gary was trying to create an application where you’re doing a lot of R&D in a lab, so you’re looking at chemistry, beakers, and how you would actually do a bunch of different tasks. It was really cool. I said, but I can’t tell what you’re paying attention to. To reference the paper ‘Attention is All You Need’: what do you focus on? I suggested he use something equivalent to CSS hover, really basic stuff, not a button, but on the actual objects themselves. Lo and behold, that day Apple announced, we now have hover effects. I was like, here, this is what you should use. Really basic stuff like that. Apple has some basic standards, and like WebXR, it takes a while; it hasn’t completely transferred into native yet just because it’s so early in its development process. There are a lot of microinteractions and paradigms that have already been established, tried and true, in responsive web and mobile development that do transfer over.
Then, there are some principles that don’t transfer. While I love Apple Vision Pro, a lot of it is very flat design; it doesn’t actually take advantage of all the 3D space. Not everything should be, here’s a 2D visualization of the stock market, and it’s in VR, so I’m going to play it in Google Cardboard. That was an experience I tried way back in the day, and it doesn’t work very well. It was like, and now I can ride through the stock market like a rollercoaster. I’m like, this is horrible. Then at a different conference, the Bloomberg Tech conference, there was a demo of a stock trading app and database. I was super excited, but it looked exactly the same as what you would have on a mobile app. I was like, this defeats the whole purpose of having it in Apple Vision Pro.
There are some applications that can be transferred over, but, again, the chapter in my book is on data and machine learning visualization, design, and development, and it’s about 3D in the context of the environment and how you would use it. We have to think about, what can we borrow from tried-and-true principles of design and development from the web and mobile? Then, what makes it unique that it’s in XR, or even adapting some things, not everything, with AI as well, that makes it much more inclusive, accessible, and something that users actually want?
Questions and Answers
Participant 1: In the planning phase, how much weight should we give to introducing accessibility into our user experience? Do you think it’s fair to say that maybe it shouldn’t be part of the MVP, that we should cater to the majority of the audience first and look at accessibility as a second phase, or should it be a discussion since day one? How do we balance the fact that the majority of your audience would, in a way, be linked to the majority of the revenue for the product? How do we justify that accessibility should be part of the MVP without that direct correlation to revenue?
Doyle: The question is, do you try to push for meeting accessibility requirements and standards in your MVP, or should that be a follow-up? If it should be part of the MVP, how do you argue for the importance of baking accessibility into that first version versus waiting until later?
Erin Pañgilinan: It really depends on your audience; it’s very relative. I’m trying to create a productivity app for myself, this is like a selfish dream, and so I’ve conducted maybe 30 user research interviews since the end of last year. They were primarily neurodivergent women with ADHD who were product managers, designers, and developers working in open source, AR, VR, AI, and crypto, so very similar to me. I’m designing, essentially, for myself. That’s a very specific audience, and even within that audience, there’s a level of richness I didn’t expect. Some of them use paper planners, some use mobile apps for planning, some don’t use anything at all and just look at their Google Calendar. Maybe they would love all of it if it worked in their Apple Vision Pro, or they’ll only use and pay for something if it also works on Android and Linux, on every platform. It’s really hard to satisfy people.
Even in a target market I thought was really small and specific, it comes down to user segmentation and features, and then, who is willing to pay for it? Take the quote from YC, Y Combinator: make something people want, and people will pay for it. I like to think that, yes, accessibility and inclusive design should let you target this type of user or population first, but I have to be honest with you, some people don’t have the budget for that and won’t prioritize it because it isn’t part of the bottom line. What it comes down to is, what set of initial users has the problem you are solving? Then, within that set of users, what are the top three features? Really narrow it down. Who is willing to pay this price for that set of features? That should be the real core of your product and part of the bottom line. Identifying that, I realized, is going to take some time. I thought I was being even more narrow and specific, and even then, it’s really hard and difficult.
Again, it’s all relative. If you’re a B2B SaaS enterprise app that wants to do telepresence, that’s a case for somebody like Cisco. For a client I had in education, it was, we want hundreds of students to be in XR, and we want them to also be able to design fashion, play fitness games, and be safer. Two totally different use cases. Some of that is the Meta Quest suite and some of that is Apple Vision Pro. Whatever you think you can prioritize, even just three things, it might be too much. You’ve got to really chunk it down and get really specific, and even then, it still might be too much. Get very granular about what that MVP is and who will pay for it, so that you can continue development.
That is, unless you have endless amounts of money and can self-fund. A lot of my friends in crypto who retired can just play computer all day and gameplay test all the things. Most people in reality can’t do that. I would get super specific and think about the context of who your customers are or who your investors are. What is it that they want, and what is it that you want? Are you willing to meet in the middle for your users?
Dylan Fox: These things are generally worth trying to make part of your MVP, and here’s why. If something like a fully-fledged screen reader system, scalable UI, or captions isn’t 100% perfect right off the bat, that’s fine. We all know MVPs can be very rough around the edges sometimes. The trap I see people falling into over and over again is that they brainstorm a big list of features, pick some as MVP, and then build out something as fast as they possibly can that meets all of those MVP features. That decides the course of development. A lot of decisions made during that MVP phase become set in stone and very hard to change after the fact.
If scalable UI is part of your MVP, then you will be building towards it from the start, and you’ll be making choices in your development infrastructure that take it into account. If it’s slated for your 1.1, what might happen is you develop to get something out the door and then realize, actually, if we wanted this, we would have needed to develop the core engine and core aspects differently. Now it’s much more work to do, so maybe it gets shifted from 1.1 to 1.2, and you start accruing technical debt, so that if you ever want to get some of these features in, it will take a big effort. Because, again, a lot of these accessibility features come down to modularity, and that is the type of thing you really want to build in at the start. Do you need to have everything ready to go? Not necessarily.
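The scalable-UI point above amounts to a modularity decision: route every size in the UI through one settings value from day one, so scaling later is a multiplier rather than a hunt through hard-coded numbers. This is a minimal sketch under that assumption; the settings object and function names are made up for illustration.

```typescript
// Hypothetical sketch of "building scalable UI in from the start":
// every piece of text and layout asks one settings object for its
// size, so supporting UI scaling is a single multiplier instead of
// a rewrite of hard-coded sizes scattered through the codebase.

interface AccessibilitySettings {
  uiScale: number; // 1.0 = default; e.g. 1.5 for larger text
}

const settings: AccessibilitySettings = { uiScale: 1.0 };

// All UI code calls scaled() instead of using raw pixel values.
function scaled(basePixels: number): number {
  return Math.round(basePixels * settings.uiScale);
}
```

Even if the scale slider never ships in v1, having the indirection in place is what keeps the feature cheap for 1.1 instead of forcing an engine-level rework.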
Think about localization. Yes, obviously, if our main user base is in English, then we’re just going to focus on English for v1. If you’re thinking about localization as part of phase one, then I guarantee you when it comes time to record all your lines in Spanish, in Chinese, in whatever else, it’s going to be like a few lines of code to change all those out, instead of, “Our text is hard baked into the game. We can’t change it at all. This is going to be a mess”. For that reason, I would say, yes, at least some of these accessibility features should be in the MVP.
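The localization example can be sketched the same way: strings live in per-locale tables and the game looks them up by key, so adding Spanish or Chinese later means adding a table rather than hunting down hard-baked text. The table contents and function name below are illustrative assumptions.

```typescript
// Hypothetical sketch of localization-ready text: the game never
// embeds display strings directly; it looks them up by key from a
// per-locale table, so swapping languages is data, not code changes.

type StringTable = Record<string, string>;

const LOCALES: Record<string, StringTable> = {
  en: { greeting: "Welcome!", quit: "Quit game" },
  es: { greeting: "¡Bienvenido!", quit: "Salir del juego" },
};

function t(locale: string, key: string): string {
  // Fall back to English, then to the raw key, if a string is missing.
  return LOCALES[locale]?.[key] ?? LOCALES["en"][key] ?? key;
}
```

The fallback chain matters in practice: a half-translated locale degrades to English instead of crashing or showing blanks, which is exactly the “few lines of code” swap Dylan describes.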
Colby Morgan: Echoing both those points, and just to add: for us at Mighty Coconut, in that early planning phase, it’s important to have that accessibility mindset of identifying the friction points, or potential friction points, because those are the really big piece for us. If you know what those friction points are and can plan for what’s coming, you can better tackle them down the road.
Erin Pañgilinan: I was developing a productivity app with AI, and I was like, this is so fun. Then I realized it’s really broad. If you do anything in AI, the first thing you think about, if you’ve done any workshop with AWS, is, this is the amount of money I have for compute spend. Early in the planning process, through user research, before I even build anything, I’m trying to extrapolate that list of features and narrow it down to the smallest possible thing. Even if you think, I can make this part of the prototype, it’s still a lot of work, so before you spend any money, do a good amount of user testing. Then when you actually spend, you’ll know whether or not it was a waste of money.
Participant 1: Actually, one point that stands out is that you mentioned it accumulates as technical debt. I think that’s part of the culture. This whole accessibility and inclusion piece, I come from the financial industry, and it’s just at the start there. My engineers have raised a lot of technical debt, but accessibility is not something anybody has raised their hand about and said we should have. We now have it as an organizational principle that it should be there, but no engineer has raised it. Maybe we start from that culture, saying we need to start looking at this. Even if it’s not part of the MVP, if people are raising that this is technical debt, then over the course of things, it might get acted on.
Erin Pañgilinan: You do have a designer in the room at all when you’re doing this?
Participant 1: Yes.
Erin Pañgilinan: I just think about Apple, where when a designer comes into the room, somebody would say it’s like God walked in. It helps.
Participant 1: Yes, that helps. Again, the designer also has to understand, as you said, where we come from, like a phone that is two centuries old. That kind of thought process takes a bit of time to come, but it does come in. You mentioned you attended the Bloomberg conference; there should be a difference between the Bloomberg version and any of the Meta conferences, things like that. That mentality is still growing. I think it was a good test that if engineers start thinking that way, that this is technical debt, then, at some point in time, it’s going to have to be introduced.
Erin Pañgilinan: Technical debt and financial debt, like my cost of compute. We used to say, this is saving developers this amount of time and spend, and talk about productivity and developer experience, DevEx. If I can identify the amount of time and money saved for an engineering team because we included design first, and how that affects the amount of compute and engineering time we’re going to spend on this feature or product line, then it’s easier to make that call. That’s harder to do for a newer problem, since you mentioned MVP, but for an existing legacy system, that’s how I would quantify it, to get a little more buy-in to include it as first principles earlier on.
Doyle: If you had to pick one piece of advice, what would you tell newcomers to the field who want to make inclusiveness a core focus?
Colby Morgan: I would tell newcomers to the space: one, really look at your user base and who your target audience is. Find ways to connect with them more directly. Identify the needs of that audience.
Dylan Fox: If you’re looking for problems to solve that actually mean something, look to disabled communities, because there’s a lot of places in the more privileged parts of the world where we’ve started to invent some bullshit problems. There’s also a lot of people out there who have real problems that, with just a little bit of design thinking and actually listening to the people with those problems, and off-the-shelf XR AI technology, I think there’s a ton of stuff that could be made that just hasn’t been yet. There’s a ton of low-hanging fruit and unsolved problems that really affect people’s day-to-day lives. Please do think about that. We have a lot of resources at XR Access that talk about how to design and develop accessibly, so check those out. Try to make sure that what you’re doing is really going to be making a difference in people’s lives and not just something shiny and novel because the technology is there for it.
Erin Pañgilinan: Be open. Try everything, at least once. Expect the unexpected. There are so many new developments. Some of my friends clairvoyantly predicted what was going to happen in XR, and I was like, no. Some of that was Michael Abrash, and we always knew glasses were going to come out, things of that nature. Look to history for what you think is going to come in the future. There are also things popping up now that I did not expect; in any new emerging technology that’s going to happen, so be really open-minded. When I say try everything, that’s because I have literally tried probably every headset and experience I could get my hands on, because you’ll find a bunch of stuff that works and a bunch of stuff that doesn’t. I don’t just say that for developers and designers, I say it for consumers too, because that’s a really good way to do user research.
Then there’s basic testing: yes, that did not work at all, so we’re not going to spend any more engineering hours, time, or money on it, because we assumed it would work functionally, and it didn’t. We expected that something that works in principle, in the PowerPoint slides or the video, should work. It did not, even though I gameplay tested it about 10 times over. So, be open, try everything, and expect the unexpected.