If I get an extra device here at the office and permission, I promise to send one to you guys.
Also, you can definitely "cast" from Android (including Gear VR) to a computer for capture. It's constantly changing, and in the early days there was a definite performance drop when casting video from the device, but it's doable.
The one thing that sucks there is that the quality is low, with distortion and doubled images, because the video feed is the raw screen feed. The best approach, if you wanted people to be able to comfortably watch exactly what you see directly within VR, would be a secondary clean equirectangular 360 video feed with an overlay indicating the user's current "headset view." A simpler alternative would be a single undistorted, cropped, "traditional" 16:9 video feed, like how the Vive and Rift usually do things.
However, most applications on Gear VR also support silent 1024x1024 px 60 fps video capture. It's a square-cropped, highly compressed, undistorted recording, not a live cast, so there's no live viewing by external devices, and Oculus / devs can disable it at points like password entry or copyright-protected movie playback. To use it, hold the "back" button on the Gear VR during any application, go to tools, choose "videoshot" recording, then return to the application.
@vinny seems like such a great dad. Really smiled big during that opening bit about his little boy asking about toys and astronauts. I hope to live up to the bar he sets with his kids with my own son.
It's to combat motion sickness. From what I can tell the GB folks are pretty resistant to VR motion sickness.
I know that some want to use left analog stick movement regardless, but:
My personal experience with left analog stick movement in VR is that it nearly turned me off of VR as a whole. I tried the oculus demos years ago, and I felt sick for the rest of the day. I thought it was inherently VR giving me motion sickness. Turns out, though, that it's just based on the game and how it implements movement. But I didn't know that until years later when I tried the Vive. So if developers use left analog stick movement in a bunch of stuff, they risk turning players off of VR who might otherwise not get sick.
Personally, I think that stuff like the Minecraft left analog stick movement mode should have a gigantic motion sickness warning. I think that stuff could hurt VR a ton if it accidentally turned players off of VR.
I would much rather have left-thumbstick movement in VR while sitting down than any kind of teleporting room-space VR locomotion (hypothetically). I feel like devs are overcomplicating VR by trying to utilize room space.
The way Minecraft controls in the VRodeo is exactly how I pictured games in VR controlling best.
Yep. Came here to say this. Controller-based movement, ESPECIALLY turning, should NOT be allowed, because users will basically make themselves sick. This is an oversimplification, but it mostly has to do with your brain's visual balance interpretation system being at odds with your brain's "inner ear" sense of balance and reading of inertial changes. It's why Dan almost fell over on that VR bike game they checked out on an Unprofessional Friday back in April (~1:45:05 into the video).
Turns out when your brain's various systems used to understand your environment are at odds, "I should barf because I'm probably poisoned or something" is the first thing that your brain seems to want to think.
You might have some tolerance or deal with it better or worse than some other people to some degree, but after some amount of time, just about everyone gets sick from this "conflict."
This is just a theory, but Jeff has probably been building up his own tolerance to it by playing VR pretty often, but I imagine at some point, after playing for at least an hour or so, and using that turn function, he would almost assuredly get sick. He might also just be one of the people who is less sensitive to this. Or he might also be doing it sparingly without realizing it so it doesn't get to his breaking point of getting sick.
I still believe the game that has best done controller-based positional movement (besides the teleportation stuff in places like Valve's "The Lab") is Lucky's Tale. It puts you in a gentle, spring-loaded camera that follows the character along, but with a lazy lag and a dampener to keep speed and directional changes as smooth and gradual as possible. If I had to compare it to something in the real world, I'd say the movement a player experiences in that game is about as gentle as one of those slow Disneyland / Disney World rides like the Haunted Mansion. Yes, you're moving, but not enough to throw someone off balance if they were to stand up.
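For anyone curious what that "lazy lag" camera might look like in code, here's a minimal sketch (my own illustration, not Lucky's Tale's actual implementation): the camera eases toward the character's position with exponential smoothing, so changes in speed and direction always stay gradual.

```python
import math

def damped_follow(cam_pos, target_pos, smoothing=2.0, dt=1/60):
    """Move cam_pos a fraction of the way toward target_pos each frame.

    smoothing: higher = snappier camera, lower = lazier lag.
    The exp() form makes the lag frame-rate independent.
    """
    alpha = 1.0 - math.exp(-smoothing * dt)  # fraction of the gap to close this frame
    return tuple(c + (t - c) * alpha for c, t in zip(cam_pos, target_pos))

# Simulate the camera chasing a character that suddenly appears at x=10:
# the camera never snaps, it glides in over a couple of seconds.
cam = (0.0, 0.0, 0.0)
for _ in range(120):  # two seconds at 60 fps
    cam = damped_follow(cam, (10.0, 0.0, 0.0))
print(round(cam[0], 2))  # close to 10, but it eased in, never jumped
```

The key comfort property is that the camera's velocity is continuous: even an instantaneous jump by the character becomes a smooth ramp for the player's view.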
Anyway, it's a bad idea to allow the user to change this, in my opinion, because it gives uninformed users who might have a low tolerance the opportunity to make themselves sick and ruin their impression of VR.
On a related note, I have a bit of a personal story about what VR can do to you if you aren't following one of the quickly emerging "rules." I've been working in VR software for about 2 years now. One evening after work at the office, I was playing around in a DK2 for about an hour, and didn't re-configure the IPD to the correct "comfortable" setting after another team member had recently used the device. I noticed some initial mild discomfort in my eyes, figured I could power through it, and quickly forgot about it.
After I finally took the headset off, I discovered my eyes were "stuck" in a semi-crossed state, due to the lenses essentially forcing my eyes to angle slightly inward to focus on objects in VR... and I couldn't get my eyes to go back to normal.
It was truly frightening.
I was initially afraid I had pulled or strained my eye muscles and wasn't sure what was going to happen. Luckily, after about 5-10 minutes of rest, my eyes' ability to focus forward properly went back to normal.
Basically, what I'm saying is: if someone is an authority in VR software development, like Abrash or Carmack, and they advise you to do or not do something... do what they say, or be prepared to face unknown, frightening, possibly barf-inducing, eye-crossing consequences.
@jdp83: Do you think we'll ever solve the Wii/Kinect/VR thing of staring at an icon and holding down a button to select it? It's always been really clunky for me, and after spending so long trying to make things go faster, it feels way too slow. Do we need to just start tracking controllers and such in the space as well, so that there's some sort of tactile feedback?
Actually, this was solved on the Kinect with Dance Central. The problem is that no one else adopted it. "Hovering directly over an item for <X> seconds with a timer" sucks because it introduces a mandatory minimum time to interact with anything, and if a mistake is made, you have to reset that timer.
Dance Central challenged this by saying, "what if we used relative movement and looked at which hand you were using and changed interactions based on that?"
So in that game, you move your right hand up and down to cycle through the menu items, regardless of where your hand starts: the relative motion of the user's hand changes the selection, rather than the hand having to hover over the item in simulated real space as it's detected by the Kinect. You then swipe left with your right hand to select a menu item (which also animates to the left, visually hammering home what's going on), or simply swipe right with the left hand, regardless of where that hand was.
If that sounds confusing, watch this to see what I mean. See in the upper right corner how the user moves his hand up and down and the menu selection changes based on that relative motion, as opposed to the hand's absolute position? Then he just swipes to the left relative to that position to select the item. He doesn't have to hover over the item; he just has to move his hand up or down until he's on the item he wants, then quickly slap it to choose it. It was BRILLIANT, and the single greatest UX achievement made on the Kinect platform... that most people didn't appreciate or follow.
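The logic behind that relative-motion scheme can be sketched in a few lines (this is my own illustration of the idea, not Harmonix's code, and the thresholds are made-up values): selection responds to how far the hand has moved since the last frame, not to where it is hovering, and a quick leftward swipe confirms with no dwell timer at all.

```python
class RelativeMenu:
    """Dance Central-style menu: relative vertical motion cycles items,
    a fast leftward swipe confirms. No hover targets, no dwell timer."""

    def __init__(self, items, step=0.15, swipe=0.30):
        self.items = items
        self.index = 0
        self.accum = 0.0        # accumulated vertical hand travel
        self.step = step        # travel needed to change selection (made-up units)
        self.swipe = swipe      # leftward travel needed to confirm

    def on_hand_delta(self, dx, dy):
        """Feed per-frame hand movement; returns the confirmed item or None."""
        if dx <= -self.swipe:                    # fast swipe left = select now
            return self.items[self.index]
        self.accum += dy
        while self.accum >= self.step and self.index < len(self.items) - 1:
            self.index += 1                      # moved down enough: next item
            self.accum -= self.step
        while self.accum <= -self.step and self.index > 0:
            self.index -= 1                      # moved up enough: previous item
            self.accum += self.step
        return None

menu = RelativeMenu(["Play", "Options", "Quit"])
menu.on_hand_delta(0.0, 0.20)           # hand drifts down: highlight "Options"
picked = menu.on_hand_delta(-0.5, 0.0)  # quick swipe left confirms immediately
print(picked)  # Options
```

Notice that where the hand started is irrelevant; only its motion matters, which is exactly what kills the "mandatory minimum time" problem of hover timers.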
The Wii didn't really have a problem with how it handled control, at least for the most part on 1st-party titles. Smart interaction designers made sure you could either use the remote as a pointer OR flip the controller, holding it more like a traditional controller, and just use the D-Pad to make selections. Not everyone did this, but Nintendo did it even on their main screen, so anyone who didn't was lazy (but that was the name of the game with most 3rd-party Wii development anyway #BringBackWiiWareJeffSegment).
In VR, pure "gaze"-based control sucks because, like the Kinect games that weren't Dance Central, it introduces a mandatory minimum time of 2-3 seconds before you can progress through a game menu. And it's even worse in VR, because you inherently look at things while trying to decide what they mean. If you're looking at a button while deciding whether to press it, and after 3 seconds it gets "selected" anyway, it feels really frustrating.
On devices like Google Cardboard, this is a problem, because it has no other way to interact unless the application not only supports some external controller (99% don't) but also overrides gaze-based control.
It's less of an issue on Gear VR, because it has a touch pad. That's great: you can use gaze to look at an object, but still let the user tap to select it. You can even introduce concepts such as swiping and scrolling with the pad. In the application I've primarily worked on for the past few years, Samsung VR (formerly Milk VR; the non-VR component is on the Android platform, and the VR component is in the Oculus Store, available if your Samsung mobile device has a Gear VR's firmware installed), I fought to get touch-pad swiping and fast-scrolling in, and to make them relative to the user's face, so swiping or fast-scroll-dragging in a certain direction correlates with the scrolling direction in VR.
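The gaze-plus-tap pattern boils down to a tiny state machine (a generic sketch of the pattern, not Samsung VR's actual code): gaze only decides WHICH item is highlighted, and nothing is chosen until the user deliberately taps.

```python
class GazeTapMenu:
    """Gaze highlights; the touch-pad tap is the explicit confirmation.
    No dwell timer, so staring at a label never accidentally selects it."""

    def __init__(self, items):
        self.items = items
        self.highlighted = None

    def on_gaze(self, item):
        """Called each frame with whatever the gaze ray hits (or None)."""
        self.highlighted = item if item in self.items else None

    def on_tap(self):
        """Called on a touch-pad tap; returns the chosen item, if any."""
        return self.highlighted

menu = GazeTapMenu(["Watch", "Browse", "Settings"])
menu.on_gaze("Browse")   # user can read the label for as long as they like
chosen = menu.on_tap()   # selection happens only on the deliberate tap
print(chosen)  # Browse
```

Splitting "point" (gaze) from "commit" (tap) is the same separation a mouse makes between hovering and clicking, which is why it feels so much less frustrating than dwell timers.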
Same with the Rift and especially the Vive. Valve's "Big Picture VR" mode is an excellent interface that allows for multiple ways to select items, type, and interact using the positionally tracked controllers. My favorite experience is still The Lab, Valve's own sort of "Wii Sports" of VR.
Just wanted to take the time to acknowledge a well thought out post. Thanks, Duder.
It's called the photic sneeze reflex, also known as Autosomal Dominant Compelling Helio-Ophthalmic Outburst syndrome (ACHOO for short)... which sounds like a name @vinny would make up to mess with Dan, but it's real.
I've been short on pixels lately so I became a space slum lord.
I built a vertical tower with rooms that I put deeds in that barely fit a single hanging copper lamp, a wooden bed, and a wooden door, so I can collect rent easily. To improve my chance to get pixels rather than junk, I also installed the Steam Workshop mod for making residents pretty much only either give random rare items or just pixels.
@zemadhatter: We're definitely not the center of the universe (not even anywhere near the center of the galaxy we're in), but we can kinda only measure from where we are and consider that to be the center of our "observable" part of the universe.