Lecture presented by Shachar Oz at UXI conference in February 2012 (http://uxi.org.il/pages/13265).
Shachar is a game and user experience designer at Omek Interactive Studios. For more information on Omek and its products, visit www.omekinteractive.com
Thank You For Listening
Shachar Oz
Game & UX Designer
Omek Interactive Studios
www.OmekInteractive.com
Shachar@OmekInteractive.com
Editor's Notes
I work at the studios of Omek; outside you see my day job. I'm a UX designer. This talk is not about the game.
Menus are quickly forgotten, but a bad UI system can drive users crazy.
In order for users not to be mad…
In one look, at any time, the user understands: where they are, where they were, what they can do, and how they can do it.
5 min. What is it? The sensor gives the distance from the camera for each pixel: a bat-like view of the world. Demo: video tracking viewer: http://youtu.be/UGGKh3cyc7w?t=1m00s
Gives: body tracking (3D position and rotation); natural gestures (body language); freedom from controllers (wet hands, accessibility). Does not give: tangible feedback; 100% stability and responsiveness (not yet); a small, accurate action square of 10cm x 10cm (like the Windows mouse).
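The per-pixel distance data the sensor provides can be turned into that "bat-like" grayscale view with a few lines of NumPy. This is a hypothetical sketch, not part of any Omek SDK: the function name, the valid depth range, and the fake 2x2 frame are all assumptions.

```python
import numpy as np

def depth_to_grayscale(depth_mm, near=500, far=4000):
    """Map per-pixel depth (millimeters) to an 8-bit grayscale image.

    Nearer pixels are drawn brighter, giving the 'bat-like' view the
    sensor provides. Pixels outside [near, far], or with no reading
    (depth == 0), are rendered black.
    """
    d = depth_mm.astype(np.float32)
    valid = (d >= near) & (d <= far)
    # Normalize so near -> 255 (bright) and far -> 0 (dark).
    scaled = (far - d) / (far - near) * 255.0
    out = np.where(valid, scaled, 0.0)
    return out.astype(np.uint8)

frame = np.array([[0, 500], [2250, 4000]])  # fake 2x2 depth frame, mm
print(depth_to_grayscale(frame))
```

A real tracking viewer would run this per frame and blit the result to the screen; the point is only that the raw data is a distance image, not a color image.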
5 min
I can touch a button and still it will not do anything; I know when I am going to press a button. No errors. I can make the movement of clicking even without clicking. No errors. We want to create a no-error environment for this technology as well. How much feedback should we give to create virtual tangibility?
User research shows four main types of cursor actions: inactive, reading, action (click, scroll, etc.), and examining. With a mouse you have accurate control of the cursor; again, no errors. How can we know when the user wants to select an item and when they just want to look at it?
Confidence: how do we give users the feeling that the application knows their identity and position? Where are they? Does the app recognize them? Does it understand what the user is trying to do?
The app cannot predict what the user wants to do; it only tracks movement. Therefore there is always a small delay between the user's decision to act and the gesture itself. We need to design the feedback to be as quick as possible, or to be combined with the user's motion.
1. Click: imitate the mouse. Move a cursor across the screen and push the hand forward to select, comparing against the hand's position a few frames earlier. Problems: hard to detect; a big movement relative to the source. http://youtu.be/VLzHtsRgEBY?t=1m (1:00-1:20)
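The push-forward click described above ("remember the position from a few frames before") can be sketched as a small detector that compares the hand's current depth with its depth a few frames earlier. The class name, the history length, and the push threshold are assumptions for illustration, not values from any real SDK.

```python
from collections import deque

class PushClickDetector:
    """Detect a 'push forward' click by comparing the hand's current
    distance to the camera with its distance a few frames earlier.

    Hypothetical sketch: history length and push_mm threshold are
    assumptions, not calibrated values.
    """
    def __init__(self, history=8, push_mm=120):
        self.history = deque(maxlen=history)  # recent hand depths, oldest first
        self.push_mm = push_mm

    def update(self, hand_z_mm):
        """Feed one frame's hand depth; returns True when a push is seen."""
        pushed = bool(self.history) and (self.history[0] - hand_z_mm) > self.push_mm
        self.history.append(hand_z_mm)
        return pushed
```

This also makes the slide's problem concrete: the detector only fires after the hand has already travelled a large distance, so the movement is big relative to the intent.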
2. Hot spot: stand and wait… We don't like waiting! (web development, UI design, etc.) Problem: it does not differentiate between selection and scrolling. The Kinect hub emphasized that difference by starting the selection process only after the cursor is above the item and not moving, which caused even more delay in the selection process. Kinect menu at 0:39: http://vimeo.com/16779301
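The hot-spot method above is a dwell timer: the item under the cursor "fills up" while the cursor hovers, and resets the moment the cursor leaves. A minimal sketch, assuming a frame-based update loop (the class name and dwell time are hypothetical):

```python
class HotSpotSelector:
    """Dwell-based selection: hovering over an item long enough selects
    it; moving the cursor away resets the timer. progress() is the 0..1
    value a UI would render as a fill animation (constant feedback).
    """
    def __init__(self, dwell_frames=45):  # ~1.5 s at 30 fps, an assumption
        self.dwell_frames = dwell_frames
        self.current = None
        self.frames = 0

    def update(self, hovered_item):
        """Feed the item under the cursor this frame (or None).
        Returns the selected item once the dwell completes, else None."""
        if hovered_item != self.current:
            self.current = hovered_item   # cursor moved: restart the dwell
            self.frames = 0
        elif hovered_item is not None:
            self.frames += 1
        return self.current if self.frames >= self.dwell_frames else None

    def progress(self):
        return min(1.0, self.frames / self.dwell_frames)
```

The delay the slide complains about is visible in the numbers: every selection costs the full dwell time, and any jitter that changes the hovered item restarts it.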
3. Significant gesture: using the body as a keyboard. Problems: uncomfortable and unusable; needs training. Examples: Adrenaline Misfits / Konami (gesture-based menu): http://youtu.be/vA6xfK7bl94?t=2m10s , http://youtu.be/UGGKh3cyc7w?t=2m45s
4. Tracking based: a swipe gesture with constant feedback on the process and an option to cancel at any point. Problems: must differentiate between two modes; needs to be taught. Examples: Fruit Ninja / Microsoft: http://www.youtube.com/watch?NR=1&feature=fvwp&v=yeu9qOEUJ-g ; Powerup Heroes / Ubisoft (improved on Dance Central's UI): http://www.youtube.com/watch?v=3WWyLuAZKjU ; Option 2: http://youtu.be/ZBh_1cO9EeE?t=1m50s
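The two properties that define the tracking-based swipe, constant feedback on progress and the option to cancel at any point, can be sketched as follows. Distances and the travel threshold are assumptions for illustration:

```python
class SwipeDetector:
    """Tracking-based swipe: reports continuous 0..1 progress so the UI
    can show constant feedback, and cancels (re-anchors) whenever the
    hand moves back. Positions are in millimeters along one axis.
    """
    def __init__(self, distance_mm=300):  # travel needed, an assumption
        self.distance_mm = distance_mm
        self.start_x = None

    def update(self, hand_x_mm):
        """Feed one frame's hand position. Returns (progress, done)."""
        if self.start_x is None:
            self.start_x = hand_x_mm
        travelled = hand_x_mm - self.start_x
        if travelled <= 0:            # moved back: cancel and re-anchor
            self.start_x = hand_x_mm
            return 0.0, False
        progress = min(1.0, travelled / self.distance_mm)
        done = progress >= 1.0
        if done:
            self.start_x = hand_x_mm  # reset for the next swipe
        return progress, done
```

Because progress is reported every frame, the UI can animate the gesture as it happens, which is exactly the constant feedback the slide credits this method with.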
5. Grab gesture: relatively intuitive, a grab gesture for drag and drop. Problems: not that stable; too accurate; requires high-quality camera data, and currently requires standing close to the camera. Examples: http://www.youtube.com/watch?v=W6eSUJFSRaA , http://www.youtube.com/watch?v=NKjkQLxwTek , http://www.youtube.com/watch?v=itmfARsVx5w
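The instability the slide mentions is usually fought with hysteresis: the hand's "openness" must cross different thresholds to grab and to release, so a noisy value near the boundary does not flicker between states. A hypothetical sketch (the openness scale and both thresholds are assumptions):

```python
class GrabDetector:
    """Grab / release with hysteresis. 'openness' is a tracker-supplied
    value in 0..1 (0 = closed fist, 1 = open palm). The hand must close
    below one threshold to grab and open above a higher one to release,
    which stabilizes a noisy signal.
    """
    def __init__(self, close_below=0.3, open_above=0.6):
        self.close_below = close_below
        self.open_above = open_above
        self.grabbed = False

    def update(self, openness):
        """Feed one frame's hand openness; returns the grab state."""
        if not self.grabbed and openness < self.close_below:
            self.grabbed = True
        elif self.grabbed and openness > self.open_above:
            self.grabbed = False
        return self.grabbed
```

With a single threshold, a reading oscillating around it would drop and re-grab the item every few frames; the gap between the two thresholds absorbs that noise.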
More ways: wave gesture; circle gesture; release (pull the hand back); move to a special spot.
1. Move the cursor freely: imitating the mouse with a normal GUI. Problems: unstable; delay. Simple solutions: big buttons; relatively large movements; stabilizing the cursor; centering it on the item. Example: HotSpot_item
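Two of the simple solutions above, stabilizing the cursor and centering it on the item, can be sketched as a low-pass filter plus snap-to-target. The smoothing factor and snap radius are assumptions for illustration:

```python
class StableCursor:
    """Smooth the raw hand position with an exponential filter, then
    snap the cursor to the center of any item close enough: the
    'stabilize the cursor, center it on the item' idea from the slide.
    """
    def __init__(self, alpha=0.3, snap_radius=60):  # both are assumptions
        self.alpha = alpha            # 0..1; lower = smoother, laggier
        self.snap_radius = snap_radius
        self.pos = None

    def update(self, raw_xy, item_centers=()):
        """Feed one frame's raw hand position (pixels) and the centers
        of on-screen items. Returns the cursor position to draw."""
        x, y = raw_xy
        if self.pos is None:
            self.pos = (x, y)
        else:
            px, py = self.pos
            # Low-pass filter: move only a fraction toward the raw point.
            self.pos = (px + self.alpha * (x - px),
                        py + self.alpha * (y - py))
        cx, cy = self.pos
        # Snap to the first item center within the snap radius.
        for ix, iy in item_centers:
            if (ix - cx) ** 2 + (iy - cy) ** 2 <= self.snap_radius ** 2:
                return (ix, iy)
        return self.pos
```

The trade-off the slide hints at is visible here: a lower alpha hides jitter but adds delay, which is why the big-buttons-and-snapping combination is used as well.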
2. Tracking based: the cursor is only slightly marked (which solves some navigation issues); constant feedback; a constantly active item; leave the menu from the top or bottom. Example: Powerup Heroes
3. No cursor, one main active spot: scroll along one axis. Problems: navigation issues (where am I?); slow reaction. Example: Media Center (live)
When clicking, stabilize the cursor. Example: Jinni (live, or video at 0:40)
4. No cursor, gesture based. Examples: Adrenaline Misfits; Panasonic demo; Media Center / Canesta: http://youtu.be/3pekrVJU_hg?t=2m40s
Simple button: hot-spot / click. Significant gesture: Minority Report. Drag 'n' drop.
Cursor: what if the camera doesn't work, or the user is not identified? Camera view: show what the camera sees. B&W: grayscale colors. Freeze!: control over the character. Trapezoid: view of the playfield from above. Note over character: the main area of concern.
Examples:http://youtu.be/jORsG8AG72I?t=1m50s
4 min
Constant feedback about the cursor, selection progress, and physical location. Tracking-based methods are more stable, comfortable, and usable (constant feedback). Gesture-based selection is faster, saving serious time. Showing a cursor is not a must.
User expectations: a one-hand-based control scheme; grandiose gestures only once in a while; constant feedback for each movement (physical, per joint).
Back to basics:
Consistency: do not change the model of control.
Simplicity: do not use two models of control; do not use more than one hand; make the scale and the model easy to understand.
Feeling of control: physically comfortable for the user; emotionally comfortable for the user; an option to cancel and undo; the ability to wander around without false actions, allowing the user to think and play without errors.
Feedback: make sure the user gets adequate feedback for their actions.