In this iPhreaks episode, Erica Sadun and Jaim Zuber talk to Greg Raiz about "products instead of features."
Greg runs Raizlabs, a consultancy specializing in iPhone and mobile development. Ben Johnson from Raizlabs has also been on the show.
A lot of engineering firms get into a lifecycle with their products where they're always thinking about the next thing. This often gets broken down into sub-features, and the focus then lands on the feature instead of what the feature is supposed to serve. Greg gives the example of trying to format a book: when you drag a picture to the left, it jumps to the right, and vice versa.
A lot of development teams are driven by sales teams. Sales teams often use feature matrices to differentiate their products instead of thinking about how their product differs from competitors'. Apple and Steve Jobs were great at this, telling a story that differentiated the product without focusing on features.
They use a Scrum methodology to develop the mission statement. Greg explains what Scrum is.
Try not to overthink the product. You don't have to plan the whole thing. Instead, you try to deliver value in discrete chunks.
You could take a timer and ask "What is the purpose of this timer?" It could be time tracking, timing something, or waking someone up.
This helps with team alignment.
Greg talks about working at Microsoft on Windows XP and they determined what the core missions were for XP. Every feature that went into Windows XP had to fit into the buckets they defined.
Both methodologies can deliver great software. It's more a matter of how you see and use them. Agile allows you to roll with the punches when new technologies and challenges come up; you can then reprioritize the features.
Delivering each week also means that people can play with something right when it's complete and you can modify it the next week.
The terms can be confusing. The main point of software development is to create something that solves a real user problem.
The Google Design Sprint methodology focuses on gathering data and making decisions over a condensed period of time.
Android, iOS, Alexa skills, medical applications, and IoT products all benefit from this way of thinking.
For people who can see, Google Maps will get you to the corner where you can see the bus stop. Blind people get to the corner, but the buses drive right by them because they're not standing in the right place.
Bus drivers are trained to pick up people who are looking for them.
GPS isn't precise enough to get blind users within a two-foot radius of where they're supposed to be.
The app's purpose was to get them to the bus stop, close enough to be touching the bus stop pole.
They considered all sorts of features, functionalities, and technologies, but it was all about the mission, not the feature.
Cities move slowly, so the infrastructure isn't going to change, or at least won't change quickly.
Agile allowed them to test several solutions and iterate on what was working.
For example, they created several interfaces, added the VoiceOver accessibility features to them, and handed them off to Perkins students.
You can test VoiceOver with the screen off. You can also test for automation and scriptability.
Most UI elements have accessibility built in, but if you do it wrong, some buttons will say "Button" instead of what the button does.
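As a minimal sketch of the fix they're describing (the button name and hint text here are hypothetical, not from the BlindWays app): an image-only button gives VoiceOver nothing meaningful to read, so you supply an explicit label describing what the button does.

```swift
import UIKit

// Hypothetical example: an icon-only button. Without an explicit label,
// VoiceOver may announce it as just "Button" rather than its purpose.
let addClueButton = UIButton(type: .system)
addClueButton.setImage(UIImage(systemName: "plus"), for: .normal)

// Describe what the button does, not what it looks like.
addClueButton.accessibilityLabel = "Add a clue"
// An optional hint gives extra context, read after a pause.
addClueButton.accessibilityHint = "Adds a new clue for this bus stop"
```

You can then verify the experience eyes-free by turning on VoiceOver with the Screen Curtain enabled, which is one way to "test with the screen off."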
Apple also has sessions on Accessibility at WWDC.
The app focuses on the micro-navigation problem. It counts on other maps to get you close, then does the micro-navigation to get you closer.
They crowdsourced clues: explanations of what you'll find to the left and right of the bus stop if you're facing it. These tell a blind person how to figure out where they are in relation to the bus stop based on what they're detecting with their canes, etc.
If you're approaching a stop, it tells you whether the street is on the left or the right. Then it tells you what you'll encounter before and after other objects to help you navigate.
A clue can also be what the stop is attached to like a wooden or metal pole, concrete wall, etc.
Raizlabs had several people in the area, as well as people who worked in the Boston office.
After a while, people started trying to find stops that didn't have clues.
Then the city got involved and started noting cracks in the sidewalk and things like that.
They followed the HIG (Apple's Human Interface Guidelines) and thought about how you build standard interfaces.
They also considered "When I get here, what do I want to hear?" They wanted to make the voice interfaces conversational.
What information does the person at this point need to hear?
People who rely on VoiceOver usually run it at around five times the speed Greg uses. VoiceOver generally doesn't slow people down.
Having the mission statement and the story behind it, and understanding the use cases, helps keep the app simple and limited to the story around the application.
"Wouldn't it be great if..." sometimes made it in and sometimes didn't with BlindWays.
They use stakeholders and project managers they work with to make sure that everyone has the same vision and to prioritize the right things that bring value.
They ruled out several solutions because they were too complex or would have over-scrubbed the data.
They instead looked for things that would specifically be to the left or right and could serve the majority of bus stops. They also allowed people to say whether a clue was helpful or not.
Greg mentions Citizens Connect, where people could take a picture of graffiti or a pothole and report it to the city, and the city would report back when repairs were made.
In this case, they're trying to reward contributors by closing the loop on clue contributions.