Well, it seems it was eight years ago that I came up with the idea, and still no one has "already done it". So, with the help of Unreal Engine 4, I am trying it myself. Head over to Astonished By Design to see how mapping eyeball rotation to the mouse/right controller stick, and having the head automatically track the direction of sight (which consequently changes the eye's position in 3D space), forms the basis of the "First Person Simulation". The first complete experience, called 02:20, begins at 2:20 am and takes 2 hours 20 minutes of real time regardless of what happens, but what happens depends on the player.
8 Years On
Are you saying that no one has made a first person simulation of some sort?
Like where you move the mouse or right stick and your point of view changes accordingly?
Have you been under a rock for the past twenty years?
@sinusoidal: I think he's saying nobody has made one in this style. In almost all FPS games you control either the head or the floating gun. In this, as I understand it, you're controlling the eyes? And moving the eyes around in 3D space forces the head to move too, but only as far as would be possible without becoming disjointed from the body. The video shows a part where, just by moving the viewpoint (the eyes) to the right, a lean is achieved as the eyes drag the head, which bends the torso, which is still anchored to the ground by the legs and feet, which aren't moving because they're controlled by the other stick/WASD.
Is that like how VR games need to model the head and neck movements when "looking around"? I'm not sure if I follow.
Hmm, interesting. I honestly don't know if this has ever been attempted in a game (I'd imagine someone must have had the idea before) and it could prove to be special. The only problem is that it means even more input options. Sounds like it might work with a head tracking device like Rift or TrackIR.
If you're gonna do more dev videos I suggest having some sort of debug overlay that shows which control is being moved at any time.
Instead of just two axes of control (body and head, which on a typical controller would be the left and right sticks) the OP is suggesting to add a third axis, the eyes. So you'd move the body, the head and the eyes individually.
Ohhh, I get it. I think. Actually, as I write this, I'm no longer sure. Seems like something I'd only get after trying it out myself.
Umm, but there's that one part in that video where you mention no WASD or left stick is being used, and that looked impressive.
Here's the demo:
I bet you have to get hands on with this to understand why it's not just your standard FPS controls...because it sure as shit looks like it's just your standard FPS controls.
@robertorri: @jesus_phish: @sinusoidal: Yes, it is really hard to tell without controlling it, but let's see if I can explain it better. It is still a first-person perspective, which, yes, has been around for years. The difference is that you are directly moving the eyes and the head automatically follows them. So the mouse or right stick maps directly to the eye's rotation axis, and the head then moves to follow the eyes automatically.

Why is this different? In real life the eye is a child of the head, so when you rotate your head to the left, the eye physically moves through an arc in 3D space. Try it yourself. Sit stationary, looking at your screen. Somebody walks past you on your left. You glance at this person for a moment: the position of your eye is still the same (only the rotation has changed). But if you keep looking at the person, your head rotates, and because your eye is attached to your head, the position of your eye changes, not just its rotation. Your eye then naturally rotates back towards the direction of your body. Now, keeping your head still, rotate your eyes back to the screen. You can actually see behind the left edge of the screen. In the video there is no extra leaning going on; it is the natural effect of the view changing because of the head and eye positions.
Basically, I have programmed the automatic tracking of the head to the eye movement in the new Unreal Engine. The usual WASD works with it too, and no extra controls are required for the "leaning" effect, which is inherent to head movement (although the movement is exaggerated to an extent). Years ago I tested it out with Director, then Unity (a video can be seen on the YouTube page), and now Unreal, which it definitely works best with.
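For anyone curious how the eye-leads-head idea might look in code, here is a minimal standalone C++ sketch (my own toy model with made-up names and constants, not the project's implementation). The stick drives the eye's yaw directly, the head chases the eye each frame, and because the eye sits slightly in front of the head pivot, turning the head sweeps the eye's position through an arc, which is where the free "lean" comes from:

```cpp
#include <algorithm>
#include <cmath>

// Hypothetical eye-leads-head rig. The eye's yaw is set directly by
// input; the head yaw follows it automatically; the eye's world
// position is derived from the head's rotation, so it moves in an arc.
struct EyeHeadRig {
    double eyeYaw  = 0.0;           // radians, driven by mouse / right stick
    double headYaw = 0.0;           // radians, follows the eye automatically
    double eyeForwardOffset = 0.1;  // metres from head pivot to eye

    void AddEyeInput(double deltaYaw) { eyeYaw += deltaYaw; }

    // Per-frame update: the head closes a fraction of the gap to the eye.
    void Tick(double dt, double headFollowSpeed = 5.0) {
        headYaw += (eyeYaw - headYaw) * std::min(1.0, headFollowSpeed * dt);
    }

    // Eye position in the horizontal plane: head rotation carries the
    // eye around an arc, changing its position, not just its direction.
    void EyePosition(double& x, double& y) const {
        x = std::sin(headYaw) * eyeForwardOffset;
        y = std::cos(headYaw) * eyeForwardOffset;
    }
};
```

Glancing (rotating only the eye) leaves the position untouched; only when the head catches up does the viewpoint physically translate, matching the real-life experiment described above.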
@49th: It actually does, yes, but in reality, when controlling that Unity build shown in the video, it is so easy to break the simulation, i.e. very easy to get into an uncontrollable spin. Thanks for the support.
Interesting. Looking at the videos, the main difference I see is a slow movement of the camera, mimicking the head, repositioning itself based on the position of the eyes. The focus point shifts from the center of the screen to the sides, then slowly reasserts itself at the center again. Is that the core concept?
It's a cool idea; it's just sadly so subtle that it's hard for most people to notice.
The centre of the view (the eye's forward rotation axis) is directly dictated by the right stick/mouse movement, exactly like any other first-person setup, so the sensitivity and move speed of the view's centre point can be altered as normal. The speed of the head tracking, and of the eye returning to the head-straight position, is built into the simulation, so the trick is how these speed variables change depending on the situation. For example, if you are in the driving seat of a car and quickly glance down at your speedometer, your head moves only very slightly, but a quicker head speed would be required if you turned to look out of your side window. In the video the speed is fixed, with smoothing/dampening between positions.
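The situational speed idea can be sketched in a few lines. This is purely illustrative (the scaling function and constants are my own guesses, not the project's): the head-follow rate grows with how far the eye has strayed from the head's forward direction, so a small glance barely recruits the head while a large turn recruits it quickly.

```cpp
#include <algorithm>
#include <cmath>

// Returns the new head yaw after one frame of damped tracking toward
// the eye's yaw. The follow rate scales with the angular offset, so
// glances produce subtle drift and big turns produce a brisk follow.
double TrackHead(double headYaw, double eyeYaw, double dt) {
    double offset = eyeYaw - headYaw;
    // Hypothetical tuning: base rate 1 rad/s, plus 4 rad/s per radian
    // of error. Small offset -> slow; large offset -> fast.
    double rate = 1.0 + 4.0 * std::fabs(offset);
    double step = offset * std::min(1.0, rate * dt);
    return headYaw + step;
}
```

With a 0.1 rad glance the head closes about 2% of the gap per 16 ms frame; with a 1.0 rad turn it closes about 8%, giving the speedometer-glance versus side-window behaviour described above.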
@videogaiden: You've done great work, man! It is both an incredible insight into the coordination between the head and the eye, and a great application of it to computer programming and video games. I wish you success to match your work and dedication. And despite coming off as a subtle innovation, we live in an age where there is great care for details, especially in video games. :)
Many thanks. All I need is the spare time to develop it. Keep an eye on twitter.com/astonishedbd for updates. There is also a theme song I made for a potential game application of this on the Twitter page, if you are interested.
This seems a lot like Mechwarrior's arms/torso system. There was also a free-look that wasn't constrained to the motions of the torso. Obviously this all predates modern analog controls.
Uhhh, interesting :) It's always awesome to see someone creative following their visions! I'll listen to the theme song.
Edit: I like the theme song. It is a good instrumental piece, soulful and intriguing.
Yeah, Mechwarrior is a great example of controlling and simulating a Mech. And talking of Mech controls, I always dreamed of owning a Steel Battalion controller. Hard to find and expensive these days.
I'm not sure about the accuracy of this. I mean, I can kind of see where you're coming from, but it just seemed like your view was drifting arbitrarily when you were looking at something. Maybe it makes more sense when you're in control, but it doesn't appear immediately beneficial (or accurate to how I would look around in real life).
Yes, you're right. At the moment there is a slight amount of drift. It's not supposed to be there and will be eliminated during development.
@videogaiden: Fair enough.
Now, 12 years on, there is still nothing which simulates the head and the eye, so I have restarted development in Unreal Engine 4.
Greetings all. It's now many more years on.
Realiview is now part of The First Person Simulator.
If you have any spare time, a view of the introduction video, a play around with the demo, and your comments would be very much appreciated.
Best wishes.
Stephen
Reminds me of Shenmue, though I know the stick there is just controlling head rotation and object tracking is done through pre-scripted camera angles.
A first-person camera is usually connected to a crosshair, which is necessary for shooters, as the goal is to point in directions rather than navigate a head through 3D space. I usually find it harder to aim and shoot in games like Escape from Butcher Bay, where inputs control a "head" rather than a camera. But, as with Shenmue and the demo video, I can see this working well for object finding and puzzles.
I like the concept but it seems like a lot of work.
@cikame: Thanks very much for the feedback.
Indeed, the Shenmue first-person sections give a nice feeling of actual connection to the objects in the world, as well as giving a nice visual focus to the object of interaction. My other favourite, if controversial, example of the tactile feeling of object interaction is DreamWorks' Trespasser, though Trespasser took direct hand control too far.
In regards to my visual system Realiview, the user is controlling neither the eye nor the head. Both are automatically simulated and essentially line up to look at the usual central crosshair position, which is still directly "aimed" by the user. In addition, the object surface at the central "crosshair" location is actually the target.
This system has not currently been optimized for shooters. Here is an earlier test video of holding a weapon.
https://youtu.be/Y_xO3NAR_O8
If you have not already, please give the playable demo a try. The link is in the description of the video.
I appreciate your interest and response.
Stephen
@mikachops: Thanks for the interest. Yes, the thoughts and implementation have been with me for a long time! I am a 40-year-old (my first platform was the Acorn Electron) who works in IT, so this is one of my spare-time things.
About 14 years ago, a thought came into my mind: video games only simulate the first-person perspective by replacing the head with a camera and moving the camera with very predictable pans, tilts, cranes, etc., much like movie camera movement. I even had the honour of discussing this new camera movement idea with the Oscar-winning inventor of the Steadicam, Garrett Brown.
But basically, I wanted to improve on the standard gliding movement of the virtual camera seen in games. That led to a fresh look at whole-body movement: instead of the body "capsule" having its translations and rotations directly controlled by the user, the feet stride towards the target direction and take the pelvis with them, rather than the pelvis/body capsule leading everything.
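The feet-lead-the-pelvis idea can be illustrated with a tiny one-dimensional model (again, my own toy sketch, not the project's code). Each step plants the alternating foot a stride length ahead in the input direction, and the pelvis is then pulled to the midpoint of the two planted feet, rather than the pelvis translating and the feet animating underneath:

```cpp
// Toy 1-D walker where the feet lead and the pelvis follows.
struct Walker {
    double leftFoot = 0.0, rightFoot = 0.0, pelvis = 0.0;  // forward positions
    bool stepLeft = true;  // which foot moves next

    // One stride in the requested direction (+1 forward, -1 backward):
    // the swinging foot lands a stride length ahead of the pelvis,
    // then the pelvis is dragged to the midpoint of the planted feet.
    void Step(double direction, double strideLength = 0.5) {
        double target = pelvis + direction * strideLength;
        if (stepLeft) leftFoot = target; else rightFoot = target;
        stepLeft = !stepLeft;
        pelvis = 0.5 * (leftFoot + rightFoot);  // body follows the feet
    }
};
```

The design point is the order of causation: input moves a foot, the foot moves the pelvis, and (combined with the eye-leads-head rig) the camera inherits motion from the body rather than the other way around.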
I have been using Unreal Engine 4 to realise my thoughts. Anyone developing a game in UE4 could at least use the Realiview system instead of a basic standard camera implementation, as it can be attached to any existing UE4 character skeletal mesh.
I have also been working on a Realiview plugin for the new MS Flight Simulator and other games that use TrackIR/FreeTrack etc.
Sorry, that is more than a couple of sentences, but it is kind of a passion project!
More of the history can be found at my website: astonishedbydesign.com
Thanks again.
Stephen