vault backup: 2023-03-14 12:14:38 (Jet Hughes)
# Natural user interfaces
- better learnability and ease of use ⇒ support tasks without changing them
- support people with special needs
> [!INFO] computers we wear communicate with more locally installed, less ubiquitous computers
context awareness
- apps are aware of the environment around the user/the app and can adjust their behaviour to suit it
> [!INFO] computer should behave in a more natural way
> [!INFO] also aware of our current goals
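The idea above can be sketched in code. This is a minimal illustration, not an implementation from the lecture: the sensor names, context keys, and adaptation rules are all hypothetical.

```python
# Sketch of a context-aware app: behaviour adapts to the sensed context
# and the user's current goal. All context keys and rules are hypothetical.

def adapt_behaviour(context: dict) -> dict:
    """Pick interface settings based on the sensed context."""
    settings = {"ring_volume": "normal", "screen_brightness": "normal", "notifications": "all"}
    if context.get("location") == "cinema":
        settings["ring_volume"] = "silent"          # behave naturally for the place
    if context.get("ambient_light") == "dark":
        settings["screen_brightness"] = "low"       # adjust to the environment
    if context.get("current_goal") == "focused_work":
        settings["notifications"] = "urgent_only"   # aware of the user's current goal
    return settings

print(adapt_behaviour({"location": "cinema", "current_goal": "focused_work"}))
```

The point is that the app changes its own behaviour; the user never touches a setting.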
automatic capture and access
- automatically record and store things to remove the burden from humans and allow us to focus on the things we are better at
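A tiny sketch of capture-and-access, assuming a hypothetical event log (the data structures and event strings are invented for illustration):

```python
# Sketch: automatic capture and access. The system records events with no
# user effort, and the user can retrieve them later. Purely illustrative.
import time

captured = []

def capture(event: str) -> None:
    """Automatically store an event with a timestamp (no user action needed)."""
    captured.append({"time": time.time(), "event": event})

def access(keyword: str) -> list:
    """Retrieve captured events matching a keyword."""
    return [e for e in captured if keyword in e["event"]]

capture("meeting started with Alice")
capture("whiteboard photo saved")
print(access("whiteboard"))
```

Capture happens in the background; the human only does the retrieval, the part humans are good at directing.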
# Examples
context awareness
![|300](https://i.imgur.com/NpTeqcL.png)
> [!INFO] phone already knows your location
> [!INFO] uses electrooculography: eye tracking without a camera. wanted to know if you could build a context-aware system based on eye movements: copy, read, write, video, browse, NULL. They achieved 70%-80% accuracy, WITHOUT using complicated machine learning.
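A rule-based classifier in that spirit might look like the sketch below. The features (saccade rate, fixation duration, blink rate) and thresholds are hypothetical, not the ones used in the cited study:

```python
# Sketch of activity recognition from eye-movement features using simple
# rules, i.e. "no complicated machine learning". Thresholds are invented.

def classify_activity(saccade_rate: float, fixation_ms: float, blink_rate: float) -> str:
    """Map eye-movement features to an activity label."""
    if blink_rate < 5 and fixation_ms > 400:
        return "video"   # long fixations, few blinks: watching a screen
    if saccade_rate > 3 and fixation_ms < 250:
        return "read"    # regular small saccades with short fixations
    if saccade_rate > 5:
        return "browse"  # rapid scanning across the page
    return "NULL"        # no recognisable activity

print(classify_activity(saccade_rate=4, fixation_ms=200, blink_rate=10))
```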
![David Lindlbauer, Anna Maria Feit, Otmar Hilliges, Context-Aware Online Adaptation of Mixed Reality Interfaces](https://i.imgur.com/pKl0wQV.png)
> [!INFO] measure cognitive load using pupil dilation. with high cognitive load they show less information on screen to avoid distracting you.
> [!QUESTION] if they block other stuff while we have high workload, how do we deci
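The adaptation idea can be sketched as below; the baseline value, dilation threshold, and UI element names are all assumptions for illustration, not from the paper:

```python
# Sketch: adapt on-screen information density to cognitive load estimated
# from pupil dilation. Baseline and threshold values are hypothetical.

BASELINE_PUPIL_MM = 3.0
HIGH_LOAD_DILATION = 0.5   # dilation above baseline taken as high load (assumed)

def visible_elements(pupil_mm: float, elements: list) -> list:
    """Under high cognitive load, show only the essential elements."""
    load_high = (pupil_mm - BASELINE_PUPIL_MM) > HIGH_LOAD_DILATION
    if load_high:
        return [e for e in elements if e["essential"]]
    return elements

ui = [{"name": "map", "essential": True}, {"name": "chat", "essential": False}]
print(visible_elements(3.8, ui))
```

Which elements count as "essential" is exactly the open question the note raises: something has to decide what gets hidden.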