Wednesday, October 20, 2010

Audio half of the audio tour


Our audio is recorded and edited! I met with Ben again today to follow up on the recording session we had last week and to assist in editing it. Although it was really him doing the editing, it was useful to see how it was done and get an understanding of the process.

So we now have our audio recorded, and now it is a matter of splitting it up for the different 'sections' of audio, i.e. in the Unity application, the 'sub-bubbles' will give access to different sections of the content if the user is only interested in, for example, the medium of the work. The PDF below shows how the content has been divided up into different headings which will make up the sub-sections of accessible audio content.






Realistic Dimension

This week I have been attempting to make our Unity recreation of the gallery space more realistic. If deployed, this part of the app would create a more augmented gallery experience, combining the basic requirement of navigation with the more abstract exploration of the concepts and information about the artworks simultaneously.

To do this, I have been working to get the artworks and gallery to a more realistic scale. I mocked up the artworks to scale in Illustrator and was surprised at how much they actually varied.

As we hadn't been given a scale or any measurements with the floor plan of the gallery which we'd used to build up the 3D model of the space, I had to judge by eye what looked right, using the animated tour of the gallery as a guide. However, once the artworks too had been made to relative scale, some of them seemed ridiculously small.

I debated this situation for a while as, having studied art, I know that the size of an artwork impacts the meaning of the artwork itself. For example, studying abstract expressionist paintings in high school, working from small-scale reproductions in books and online, didn't compare to seeing them in real life at such a scale that they completely encompass one's field of view. The size of the artwork is interrelated with the medium and meaning, which is why I was stressing over getting an accurate depiction of this within the Unity recreation.

So for the purposes of this application, it makes more sense to have the artworks at a size where they are visible, especially when taken in the context of the iPhone as a platform, where you have a much smaller screen size than, say, a computer. Taking usability into account, the artworks also act as clickable 'buttons', hence they need to be a reasonable size for the human finger to touch without difficulty (the ideal minimum button size for the human finger is 44x44 pixels). In terms of visual aesthetic, it is important that the viewer can actually have a good view of the artworks within the space so they can see what links are being made without having to fiddle too much with the navigation to get in 'closer' to the work.
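As a rough sketch of this trade-off between true scale and tappability (everything here except the 44x44 guideline is hypothetical, just to illustrate the reasoning):

```javascript
// Minimum tappable size for a finger, per the 44x44 pixel guideline.
const MIN_TOUCH_PX = 44;

// Given an artwork's true-to-relative-scale on-screen size in pixels,
// return the factor needed so its smaller dimension reaches the minimum
// touch target. Small works get enlarged; large works are left alone.
function touchableScale(widthPx, heightPx) {
  const scale = MIN_TOUCH_PX / Math.min(widthPx, heightPx);
  return Math.max(1, scale); // never shrink, only enlarge
}

// e.g. a tiny work that would render at 20x30 pixels at true relative scale
console.log(touchableScale(20, 30)); // 2.2 - enlarge 2.2x to stay tappable
console.log(touchableScale(100, 200)); // 1 - already big enough
```

This is exactly the compromise described above: accurate relative scale where possible, overridden only where a work would become too small to see or touch.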

As the primary purpose of the app is to be used in the gallery itself, the user would have the artworks in front of them anyway and would be able to make the visual connection. So at this point, usability is more important. The scale of the gallery itself is a huge improvement as the initial mock up had seemingly small rooms, high walls, narrow doorways and no ceiling.


(Click to enlarge)

I then worked on trying to get the lighting right. In the animated tour, the gallery has a more cream sort of light, which I tried to recreate. The difficult part is getting not only the tone right but also the lighting itself. Currently it looks quite dim, but it is an improvement as it looks a bit less like a 'cold' 3D render.

This part of the project will likely not be integrated as part of the final app itself but rather be a standalone component which supports the project as a whole. It is important to deploy it and start testing it on the iPhone itself so we can start getting user feedback. It makes sense to us as we have been working with it so closely, but it has been a difficult concept to explain to others; we are hoping that if we create it right, it will make sense to the user upon being presented with it.

The phase of creation needs to start drawing to a close in order to finish, test and document this project for our final presentation and end of year exhibition. Feedback we can get now will inform our reflections and give us a greater depth of knowledge to discuss in our final critique.

Sunday, October 17, 2010

Navigation in 3D Space

Even though we are not sure we will be able to implement the Unity 3D navigation audio tour interface as an integrated part of the final app, we are still aiming to have it as a functional app in itself for this project.

Once the 3D model of the space was built in Maya from the floor plan of the gallery we were provided, it was imported into Unity 3D and the artworks added onto the walls as game objects (the scale and positioning of these artworks is not precise).

Game objects in Unity can then have scripts and behaviors added to them. Trixi made an initial version to give an idea of what we want to achieve in our plan to visually connect the ideas, audio content and artworks within the 3D space. The video below shows this: when an artwork is clicked, a bubble pops up. When this bubble is clicked, related bubbles appear with ideas linking to other works within the space. A 'play' bubble also appears which, when clicked, triggers the audio content to start and changes to a 'stop' bubble.



As I hadn't done Javascript before, my initial experimentation was the movement you see when the bubble starts 'bouncing' while the audio is playing.

From here, I created the flowchart below to plan out exactly how these bubbles should behave in reaction to each other and to the user's interactions. From there, it is a matter of figuring out how to code it.


(click to view larger)
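The click-driven bubble behaviour from the flowchart can be sketched as a small state machine in plain Javascript. This is only a hypothetical sketch of the logic, not our actual Unity script, and all the names are made up:

```javascript
// One artwork and its bubbles, modelled as states: closed -> open -> expanded.
function makeArtworkBubble(relatedIdeas) {
  return {
    state: "closed",
    audioPlaying: false,
    related: relatedIdeas, // ideas shown as sub-bubbles linking to other works

    // Clicking the artwork pops up its main bubble.
    clickArtwork() { if (this.state === "closed") this.state = "open"; },

    // Clicking the bubble reveals the related-idea bubbles and a 'play' bubble.
    clickBubble() { if (this.state === "open") this.state = "expanded"; },

    // The 'play' bubble toggles the audio and becomes a 'stop' bubble.
    clickPlayBubble() {
      if (this.state === "expanded") this.audioPlaying = !this.audioPlaying;
    },
  };
}

const bubble = makeArtworkBubble(["medium", "colonial history"]);
bubble.clickArtwork();     // main bubble pops up
bubble.clickBubble();      // related bubbles + play bubble appear
bubble.clickPlayBubble();  // audio starts; play becomes stop
console.log(bubble.state, bubble.audioPlaying); // expanded true
```

Working the interactions out as explicit states like this is essentially what the flowchart does on paper.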

We will probably also face a few difficulties when it comes time to convert it over to an Xcode project. We have been working on the standard trial version of Unity, so we will have to adapt it to the iPhone version, which will involve changing 'clicked' actions over to iPhone-based gestures.

What I've had more trouble with is getting my head around yet another new language. I briefly learnt Java and last used it a year ago; with this and a general understanding of programming, I can look at scripts and documentation for Unity and get an idea of what is going on. But I am having trouble writing it myself and figuring out how to make it do what I want, so I am getting stuck, especially as I've spent the last two or so months learning and focusing on Objective-C.

Though it isn't overly complex in itself, setting up the relationships will be more fiddly than anything else. Hopefully once I've discussed it with the tutor, it should become clearer and be ready to deploy onto the iPhone next week.

Saturday, October 16, 2010

From code to iPhone

Version 1.0 of our app has been deployed! This version is the initial integration of the various components that each team member has been working on. The base navigation structure was mine, the map view was Ryan's and loading the content from the database was Dodo's. The three components were integrated by Trixi.



It is of course still not without its flaws (and has crashed on occasion while I have been showing people) but it was a good and important experience to learn how to actually get an app onto the iPhone.

From here I have made the decision to step back from the coding aspect to focus on other important aspects of this project. From the outset we realised we would run into difficulty in that we had six people in the group and everyone wanted to do coding. As a result we have almost forgotten that we need audio content as well, among other things.

I have made this decision as I feel content having spent some time gaining an understanding of the language and building something functional which has been deployed, even though the final version will most likely be far from this. Instead I will be focusing on getting the Unity 3D navigation component up and running to be deployed on the iPhone too, and getting the content ready. This involves making sure that the audio is recorded and cut up to fit with how we want to use it, along with any other content we may require.

Additionally, there has also been the idea of modifying the app for exhibition purposes. For the end of year BCT exhibition, we aim to have both the Unity application and the iPhone application ready and set up for demonstration purposes but to make it additionally more interesting, James suggested we modify it so that the content is an audio tour of the BCT exhibition.

As we only have material for nine artworks for the gallery tour, this would be a much better way of demonstrating what the application is for and how it will behave, as well as a good way to get valuable feedback on the usability of the app. It would also be a good way to provide people from outside BCT with more information about the projects on display and the BCT as a whole.

I am excited about the potential of doing this as in previous exhibitions, I haven't felt particularly enthusiastic about showing off my work and none of them have been particularly engaging as they've all been fairly static. I am looking forward to having the opportunity to produce something more interactive to exhibit.

Thursday, October 14, 2010

On Air

Two hours in the radio recording booth and we managed to get all our audio recorded. One thing we quickly realized was the difference between reading something in your head and reading it out loud. A lot of simple grammar or spelling errors which we automatically correct and understand become big obstacles to trip over. A few times we had to stop and restructure sentences or add or remove words.

A few interesting things came up which we hadn't considered. For example, a lot of the descriptions had Maori words in them, which it was important to get right. As the tour would be documenting the permanent collection of the gallery, many words would have links to Maori culture and significance, especially as the collection tells the story of New Zealand. Ashlee was familiar with the language so could make sure the pronunciation was correct, and Leif's knowledge of European languages was useful in getting the names of some of the more foreign artists right.

On this note, we also made the observation that many people from overseas wouldn't understand Maori terminology, and in a lot of the works the usage isn't explained. This creates the opportunity to build into the app a glossary of Maori terms, so that within the written text the user can tap on such a word and get a definition or explanation. This only works to enrich the gallery experience we are creating, as it reinforces the meaning of the work in relation to the historical and cultural context of New Zealand art and heritage.

Ben and Ashlee were very professional and familiar with the equipment and the process, which ensured it all went smoothly. Their voices were authoritative and friendly, and they kept a good pace: slow enough for Leif to understand, yet not so slow as to be boring for native English speakers. It was also good to differentiate between the voice used to read the descriptions and the quotes, as the quotes were to sound more like a conversation with the artist so didn't have to be as formal.

Even though I'd read through the descriptions of the artworks before, it was completely different hearing them read out. It is too easy to skim when reading words on a page; it makes more sense when you have a voice telling you about it. I feel Ashlee and Ben did this really well; I could've listened to their voices all day.

Even though I've been interested in art for a few years and been to many galleries and exhibitions, I have never taken an audio tour. I found myself considering what the reason for this might be and realized it was because I like to explore and learn at my own pace and whim. I've learnt about art by studying it through books and discussions with friends, teachers and classmates. This is what I feel is the most enriching experience, so I have never felt the appeal of taking an audio tour. So it was interesting that by recording the audio for our tour, I already felt like I was experiencing the works in a different way. It made me look forward to going to actually see the works when the gallery opens.

By having to voice the audio, I think Ben and Ashlee also got a better understanding of the project. At one point when Ashlee was reading and Ben was recording, the description referenced an artwork other than the one being talked about. We explained to Ben that in cases such as this, the referenced artwork would be linked to within the app, either within the space if it was also part of the exhibition, or with an image shown. Ben described it as 'hyperlinking' in real life, and I think this is a very good way to describe and approach it.

So with the audio recorded, it will be cut together into the appropriate sections, ready to be input into the app. Ben and Ashlee have gone above and beyond to help us out, and it has been good learning a bit about how their department works and what they do. They even have a radio station, Static 88.1, so have a listen and support them!

So with the audio side of the audio tour well underway and the app being pieced together, the 3D Unity navigation interface needs also to be completed and implemented. With one week left until study and exam weeks, the end of this project looms ever near.


Wednesday, October 13, 2010

Putting the 'Audio' into Audio Tour.

As we enter a new phase of this project, recording the audio, we once again need to consider the implications of any decisions we make at this point. Upon meeting with the two people from communications who are helping us with this aspect of our project, I found it was good for us to have to explain the project to someone outside the sphere of this project and the creative technologies department altogether. During our meeting with them, it was highlighted that the collaboration between the two disciplines would be interesting: they come from a disciplinary culture whose industry has a very 'top down' mechanism, whereas BCT culture has a different directorship model which is more collaboration based.

Rather than having them just "be in" our project as the voices, we want them to be a part of it; by bringing their knowledge and expertise to it, they make it a richer project. Rather than "it's your project, your assignment, how do you want it done", we want their input, as they have the knowledge of what makes good audio, and we want to tap into their professional advice, all the while still considering the requirements of the client, to whom we answer back; ultimately the project and its outcome are our responsibility.

Audio is not something any of us has worked with in depth, so their technical expertise is definitely something we can utilize, especially as they have the resources and the equipment, and know how to use them.

Looking over the written content we were provided highlighted the differences in converting written content into spoken content, as the two differ, but Ben and Ashlee, our two voice artists, seemed adept at this. We decided it would be a good idea to take one of the international students in our group along to the recording, to ensure that it was being recorded in such a way that they understood it clearly enough, as a lot of 'self-improvers' would probably be tourists from overseas whose first language might not be English.

Important issues are raised in terms of the effect the audio will have on the project, especially within the scope of the long-term project. For this shorter component of the wider project, we are targeting specifically 'self-improvers', so we need to consider what sort of voice they would want to hear, while also considering, for the future scope, what 'character profiles' we have and what sort of voice they'd want to listen to: what is that 'voice' to different people? This is part of breaking away from the 'one size fits all' approach of the traditional audio tour guide with the authoritative voice telling you what to do and where to go, where everyone gets the same experience; it is very linear, very prescriptive, it's not about you. Rather, art should be about your personal experience and interaction with it and what you want to get out of it.

Ben and Ashlee are used to talking in a 'radio' voice, and while that might not be exactly the right tone, they're used to the concept of having to sit down and pretend they're having a conversation with a person who isn't there. That more casual, conversational voice is going to be desirable for some of the character profiles and for most self-improvers, as it is familiar and friendly. We're aiming for friendly yet informative, and after talking with them, it seems they are able to do the whole range of tones.

Finally there was the issue of how to split up the content into the male and female parts. It was decided that where the artist of the work was quoted, it would be appropriate to have a female voice quoting a female artist, and therefore the rest of that dialogue would be in the male voice. This then also works vice versa where a male artist was quoted. This resulted in a fairly even balance between the two voices.
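The splitting rule is simple enough to write down as a tiny function. This is just an illustration, assuming for the sake of the sketch that Ashlee is the female voice and Ben the male:

```javascript
// The voice-splitting rule: quotes are read in a voice matching the quoted
// artist's gender, and the surrounding description goes to the other voice.
function assignVoices(artistGender) {
  const quoteVoice = artistGender === "female" ? "Ashlee" : "Ben";
  const descriptionVoice = quoteVoice === "Ashlee" ? "Ben" : "Ashlee";
  return { quoteVoice, descriptionVoice };
}

console.log(assignVoices("female")); // quotes: Ashlee, description: Ben
console.log(assignVoices("male"));   // quotes: Ben, description: Ashlee
```

Applied across the nine works, a rule like this is what produced the fairly even balance between the two voices.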

Voice and audio is not something I've had to consider as part of a project before so it is good to have the assistance of people who know a lot about this area.


Wednesday, October 6, 2010

Creating

Phase Five - Creating
23rd August Onwards

The creation phase is well and truly in progress! We have spent a month or so working on picking up the language used for developing iPhone applications. We have varying amounts of programming skill, ranging across Processing, Arduino, Java, Javascript and C++, so if nothing else, we are familiar with the fundamentals of programming.

For me, jumping into Objective-C and object oriented programming was a bit of a challenge. Most of the last month or so was spent working through examples, tutorials and sitting in with the 1st year BCT students who started learning from scratch about two weeks after we had started. Coming back after mid-semester break it was time to start trying to write our own app.

For my first attempt, I tried to make a series of custom buttons to reflect the interface on the Auckland City Art Gallery Website and have them lead to separate views which would ultimately contain content.

As the client didn't specify what sort of visual aesthetic they were interested in, we decided to stick with just trying to reflect the aesthetic of the website.

The Flash interface which is the main component of the homepage is clever and aesthetically pleasing, but redesigning it for a touch interface poses new challenges, as it is a different visual language to the web. The initial obvious limitation is size: those three simple buttons running down the side already take up about a third of the screen. This led to the debate of whether our app should run primarily in landscape or portrait mode.

Arguments for landscape were the strongest in that a larger proportion of the artworks were landscape, and that buttons and content could be more easily arranged to be aesthetically pleasing.

Arguments for portrait mode were that table views (which we are highly likely to be implementing in one, if not many, forms) work better in portrait mode. I argued that it feels more instinctive to hold an iPhone in one hand and, hence, in portrait, while landscape feels unnatural in one hand and almost demands to be operated with two.

The argument seemed to be that, overall, landscape worked better for the visual aesthetics and content while portrait worked better in terms of physical usability. Obviously we could have it operational in both modes, but this would require more intensive coding and might be less instinctive to use, as the user would have to switch between them and think about how they have to use it. There is also the added delay in the time taken for the screen to rotate.

In the end we decided to work towards a landscape orientation for the time being and spend some time mocking up some visuals to reflect these ideas. From here we began to build our visual mockups.



Here I attempted to amalgamate the Auckland City Art Gallery visual aesthetic with the iPhone UI features. It is not properly to scale (if scaled down to an iPhone screen, the text and buttons would be far too small). I tried to make the buttons look more 'pushable' by adding a gradient and drop shadow. The black buttons represent the functionality of the UI tab bar commonly found in iPhone apps.



Dodo's mock-up takes scale into account and the aesthetic still reflects the existing gallery's style. I argued that it looks too flat, with no obvious visual cues as to what is pushable. We also discussed the issue of the buttons being too small to 'push'.



Daniel's is again a bit different, taking on board the colour theme and general style without recreating it exactly. It makes good use of the limited space by sticking to the smaller buttons. In all cases I think the semitransparent black text box makes effective use of space.

Bones, flesh and organs - building up the app.

With the end of semester fast drawing near, it was time to make a plan as to what components needed creating for this app and to actually get it running on an iPhone for testing. A formative assessment with two tutors provided positive feedback on where we were at and suggestions on how we can work more cohesively as a team.

As we all want a part in the coding, we delegated different sections of our proposed app among the six of us. Our task list was as follows:

- create a Singleton and access it from another view
- navigation for the application (blank views)
- view that loads details from database
- create a structured database and fill it with content
- figure out audio content, record and edit it
- play Audio content
- 2D Map Navigation
- User interface & Graphics
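The first task on that list, the Singleton, is worth a quick illustration: it is simply a pattern guaranteeing one shared instance that any view can access. A minimal sketch in Javascript (the actual app is in Objective-C; the `AppData` name and its fields are hypothetical):

```javascript
// Singleton sketch: one shared instance of app state, accessible from
// anywhere (e.g. several views reading and writing the same data).
const AppData = (function () {
  let instance = null;
  return {
    sharedInstance() {
      // Lazily create the instance on first access, then always reuse it.
      if (instance === null) {
        instance = { currentArtworkId: null, audioPlaying: false };
      }
      return instance;
    },
  };
})();

// Two 'views' asking for the instance get the very same object:
const a = AppData.sharedInstance();
const b = AppData.sharedInstance();
a.currentArtworkId = 3;
console.log(b.currentArtworkId, a === b); // 3 true
```

This is why it sits first on the list: once the shared instance exists, the navigation, database and audio components can all talk through it.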

My task was to create the bones of the app, the structure and navigation that the other sections would be integrated into. Despite having initial trouble getting my head around root controllers, view controllers, table view controllers and navigation controllers, I found a tutorial on YouTube very helpful and once I'd created this, I was able to start building on it.

It doesn't look like much yet but I felt proud of what I'd achieved.
We'd agreed that the default aesthetics of the tab bar, table view and navigation controller weren't particularly exciting but there are ways to customize it. This here is the bare bones.

Dodo was working on creating and loading content from a database, which she successfully got working. This too is in the early stages of development, and this sort of content loaded from the database forms the vital organs of the app.
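In rough outline, 'loading details from the database' means looking up one artwork's record by its id and handing the fields to the detail view. A sketch only, with a hypothetical in-memory object standing in for the real database, and all titles and filenames made up:

```javascript
// Hypothetical stand-in for the content database: details keyed by artwork id.
const artworkDb = {
  1: { title: "Untitled", medium: "oil on canvas", audioFile: "artwork1.mp3" },
  2: { title: "Landscape", medium: "watercolour", audioFile: "artwork2.mp3" },
};

// The detail view asks for one record by id; missing ids are an error.
function loadDetails(id) {
  const record = artworkDb[id];
  if (!record) throw new Error("No artwork with id " + id);
  return record;
}

console.log(loadDetails(2).medium); // watercolour
```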

Ryan is working on the scrollable 2D map with buttons using MapKit and UIScrollView, which has been a challenge. The 3D map, which Trixi and I have been building using Unity, poses its own set of challenges too.

With only a few weeks to go, there is much more creating to do!