(1) Summarize the feedback you received on your storyboard.
Overall, the feedback on our storyboard was positive. One of the main points of critique was the lack of an environment showing where our persona would most likely use the app and its accessibility options. Additionally, we had not drawn a face in every frame, which made the storyboard seem less consistent. In our storyboard, the persona interacts with their phone; we see this over their shoulder, but the visible screen is rather small. We were advised to include a larger copy of those screens wherever they appear in the storyboard, so a viewer can more easily understand what is happening.
(2) Develop an interactive paper prototype.
For starters, we decided to use Figma to create our prototype, as it seemed like a simple but effective tool. We wanted the prototype to address our user scenario from the previous assignment. That made the screens for activating accessibility settings very important, as well as the screens for registering a visit to the doctor and for sharing and creating documents.

As we wanted to showcase our idea of including a voice-controlled assistant, we had to add a button for it on every screen, along with the button links to and from said assistant. We decided against making a separate version for each combination of accessibility tools, as that seemed excessive, though it is an idea to keep in mind for the next task.

We believe our storyboard is represented rather well in our prototype, as all the actions taken by Martin can be reenacted here as well.

In the end, the prototype could probably use some more screens to explore. This includes at least a dummy version of the menu with accessibility options disabled, so the difference becomes clear (that would probably be a screenshot of the actual app). Overall, we are happy with our result.
The link to our prototype: https://www.figma.com/proto/FGW7aa5zIfj257zIb1JG90/FirstPrototypeHCIePA?type=design&node-id=0%3A1&t=nJUjm5Zdl5hhyjvY-1
Dear team, you are on a good track, though you should reconsider how accessibility is actually realized in smartphone apps. As a rule, you as a developer rely on the platform's features rather than implementing them yourself. It is therefore unclear to me why you have a "voice assistant" menu item. See also this information from Apple:
https://developer.apple.com/documentation/bundleresources/information_property_list/uisupportsfullscreeninassistiveaccess
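For reference, the key documented on the page above is declared in an app's Info.plist rather than implemented in app code; a minimal fragment might look like the sketch below (only the key name is taken from the linked documentation, the surrounding structure is standard plist boilerplate):

```xml
<!-- Info.plist fragment: opts the app into full-screen layout under
     Assistive Access instead of the default reduced frame.
     Key name per the Apple documentation linked above. -->
<key>UISupportsFullScreenInAssistiveAccess</key>
<true/>
```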
Here it is also pointed out, for example, that a person is needed to create such a setting:
https://developer.apple.com/videos/play/wwdc2023/10032/
Take another look at the principles of Assistive Access here. I think this will help you sharpen your goals. For instance, I could imagine you trying to apply these principles to an example (taken from the current TK app) and then comparing the implementations. I am curious to see how it goes 🙂