Once the gallery space was set up in Unity and looking reasonably realistic, it was time to add navigation features. With Unity iPhone, the scripting had to be adapted from the inputs we had been using (touch input rather than keyboard and mouse), but we also had to consider the practical usability of how best to implement navigation with these different inputs.
This poses an interesting conflict: how do you interact with a 3D environment through a 2D touch interface? With touch, there is no keyboard, mouse, joystick or controller, the traditional components used to move around and interact. This is where the limited screen size and the touch interface become crucial factors.
With a touch screen device, the screen is the one and only form of input, and this method of direct manipulation shapes how the user interacts with the device. The user can have both zero and many locations on the screen at once: with no fingers on the screen there is no location at all.
The hardware is the content - an application becomes the entire content for the duration it is running.
As a starting point, we had a look at Penelope, an iPhone app available in the App Store which was built using Unity 3D. The source code, files and a tutorial are available on the Unity website to show how to integrate UI elements into Unity applications for iPhone.
It offers three different ways to navigate through the space, two of which are based around an on-screen 'joystick' offering the option of camera-relative and player-relative control. The third option is tap control, where the figure moves to where the player taps.
Unlike that game, in our app the user will not be controlling a 'character' on screen; rather, they will be navigating themselves through the space, so the camera acts as the 'player' that the user controls. As this is not a game, the joystick interface is not relevant.
Tap-to-move is the most applicable. However, as we have other elements on screen that the user needs to interact with, it could become too difficult to accurately select where the user wants to move.
Instead, the other feature of the iPhone we can utilize is the accelerometer. This way, the user can move through the space simply by tilting the iPhone in the desired direction of movement, and tap on the screen only when they want to interact directly with the space (i.e. the 'bubbles').
For our application, this is more intuitive than the joystick and more functional than tap-to-move, but it needs careful calibration: movement should feel natural as the user tilts the device, and the camera shouldn't drift around when the user is holding the iPhone comfortably and wants to stay still.
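For anyone curious how a tap can be mapped onto something in the 3D scene, the usual Unity approach is to cast a ray from the camera through the touch position and see what it hits. The snippet below is only a sketch of that idea rather than our actual project code; the 'Bubble' tag and the logging are placeholders:

```csharp
using UnityEngine;

// Hypothetical sketch: detect a tap and raycast into the gallery scene.
// "Bubble" is a placeholder tag for the interactive artwork markers.
public class TapToInteract : MonoBehaviour
{
    void Update()
    {
        // Only react when a finger has just touched the screen.
        if (Input.touchCount == 1 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            // Convert the 2D touch position into a ray through the 3D scene.
            Ray ray = Camera.main.ScreenPointToRay(Input.GetTouch(0).position);
            RaycastHit hit;

            if (Physics.Raycast(ray, out hit) && hit.collider.CompareTag("Bubble"))
            {
                // Placeholder for whatever the bubble does when selected,
                // e.g. revealing the artwork's content.
                Debug.Log("Tapped bubble: " + hit.collider.name);
            }
        }
    }
}
```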

To implement this through code, the online documentation and scripting reference were extremely helpful. I was given code for reading the accelerometer input from the iPhone and using it to move an object. From there it was a matter of calibrating the movement so it felt natural in the gallery space, and setting thresholds on the values so that the user could hold the iPhone naturally and it would only move when they intended it to. I also ran into an issue where, once this code was applied, the character controller stopped reacting to the colliders on the walls and would move straight through them, which had to be fixed.
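As a rough illustration (not the exact code from the project, and written against the current Unity API rather than the older Unity iPhone classes), the sketch below reads the accelerometer, ignores small tilts via a dead-zone threshold so the camera stays put when the phone is held at a natural angle, and moves through a CharacterController so the wall colliders are still respected. The speed and threshold values are placeholders that would need calibrating by hand:

```csharp
using UnityEngine;

// Hypothetical sketch of tilt-to-move navigation with a dead zone.
[RequireComponent(typeof(CharacterController))]
public class TiltNavigation : MonoBehaviour
{
    public float speed = 2.0f;      // movement speed in the gallery (placeholder)
    public float deadZone = 0.2f;   // tilt below this counts as "standing still"

    private CharacterController controller;

    void Start()
    {
        controller = GetComponent<CharacterController>();
    }

    void Update()
    {
        Vector3 tilt = Input.acceleration;
        Vector3 move = Vector3.zero;

        // Only respond to deliberate tilts, so a relaxed grip doesn't cause drift.
        if (Mathf.Abs(tilt.x) > deadZone)
            move.x = tilt.x;
        if (Mathf.Abs(tilt.y) > deadZone)
            move.z = tilt.y;

        // Move relative to where the camera is currently facing.
        move = transform.TransformDirection(move);

        // Using the CharacterController keeps collisions with the gallery walls,
        // unlike translating the transform directly.
        controller.SimpleMove(move * speed);
    }
}
```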
The swipe-to-rotate was a little more difficult. I was working from code which took the swipe movement across the screen and used it to translate an object. I spent a few days trying to rework this to rotate the camera instead, to give the user the ability to 'look around'. After building the calculations up by hand, I found a built-in function which did all of the maths for me and reduced my code to about half a dozen lines. It was, however, a good learning experience.
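For context, this is roughly what the short version looks like once the touch delta is fed straight into Transform.Rotate. I can't say this is the exact function I ended up using, and the sensitivity value is a placeholder:

```csharp
using UnityEngine;

// Hypothetical sketch: swipe across the screen to look around.
public class SwipeLook : MonoBehaviour
{
    public float sensitivity = 0.2f;  // degrees of rotation per pixel of swipe (placeholder)

    void Update()
    {
        if (Input.touchCount == 1 && Input.GetTouch(0).phase == TouchPhase.Moved)
        {
            Vector2 delta = Input.GetTouch(0).deltaPosition;

            // Horizontal swipe turns the camera left/right around the world up axis;
            // vertical swipe tilts it up/down around its own x axis.
            transform.Rotate(Vector3.up, delta.x * sensitivity, Space.World);
            transform.Rotate(Vector3.right, -delta.y * sensitivity, Space.Self);
        }
    }
}
```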
The video below shows a demo of this in action.
Unfortunately, some of the other code we tried to adapt, which created the bubble behaviours, revealed the content and handled interaction with the artworks, generated a lot of errors. A lot of what standard Unity handles in the background has to be done manually for Unity iPhone, and with the time constraints we weren't able to adapt and integrate it.
So for the final hand-in of this component of the project, we have two standalone pieces to present: Unity iPhone running on an iMac connected to an iPhone, which lets the user navigate through the space; and another iMac running standard Unity, where the bubbles and content can be explored with mouse and keyboard.
If this were developed further, the next stage would be to combine the two successfully and have it running on the iPhone as a standalone app (currently it only communicates over WiFi and acts as a remote). From there, it could be embedded into the audio tour iPhone app as an option for navigation, with the 'bird's eye' view / basic map still accessible as well, since not everyone would be comfortable with the 3D interface and some would prefer to stick to the familiar 2D map.
However, I am pleased that we will be able to exhibit these two components for our final presentation.