To make movement easier to understand, camera functionality was added. The player can turn the camera left or right; vertical (up/down) rotation is not used, to keep the player from becoming confused about their orientation. Sound is played in 3D space, so when the camera turns, sounds are heard as coming from their new direction relative to the player's facing. Since there are no visuals to tell the player where they are facing or how far they have turned, a "compass" mechanic was also added. The compass is a fixed point on the player model that does not move when the player turns and is always oriented toward North. When the player presses the compass key, a sound plays indicating where North lies relative to the direction they are currently facing.
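The core of the compass mechanic is computing the signed offset between the player's facing and North. The following is a minimal sketch, assuming yaw is measured clockwise from North in degrees; the function name, thresholds, and verbal cues are hypothetical placeholders, not the actual implementation.

```python
def compass_cue(player_yaw_degrees: float) -> str:
    """Describe where North lies relative to the player's facing.
    Assumes yaw is measured clockwise from North (yaw 90 = facing East).
    Hypothetical sketch; the real cue would be played as audio."""
    # Signed offset from the facing direction to North, in (-180, 180].
    offset = (-player_yaw_degrees) % 360
    if offset > 180:
        offset -= 360
    if abs(offset) < 15:
        return "North is ahead"
    if abs(offset) > 165:
        return "North is behind you"
    side = "left" if offset < 0 else "right"
    return f"North is {abs(offset):.0f} degrees to your {side}"
```

For example, a player facing East (yaw 90) would hear that North is 90 degrees to their left.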
As a further improvement to navigation, the Map Menu was configured to load the positional data of 3D game-world objects and translate their positions onto the 2D map screen as "map markers". Through the Map Menu interface, the user reads the map with two sliding axes, one horizontal and one vertical. When either axis crosses a marker and a gamepad is in use, weak haptic feedback signals the marker's presence. The other axis can then be adjusted until the intersection of the two axes contacts the marker, at which point a stronger haptic signal is issued and the name of the object associated with the marker is read to the player. The player then has the option to place a special marker on the map called a "waypoint", which will allow them to locate the object in 3D space via 3D positional audio (this is still pending implementation). Additionally, the player can "snap" the sliding axes to the position of any waypoint marker, or to the marker representing the player character's position in the game world. Spatial intervals on the 2D map screen are indicated by click sounds. By relying on this audio feedback and the "snap" feature, the player can approximate the distance between the player character and specific objects in the 3D game world, and the direction of those objects relative to the character. The compass feature can, of course, be used alongside these features to help maintain spatial orientation.
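The weak/strong haptic logic above can be sketched as a simple check of which axes overlap a marker. This is an illustrative outline only: the `Marker` type, tolerance value, and return conventions are assumptions, and the real version would call the engine's rumble and text-to-speech APIs rather than return strings.

```python
from dataclasses import dataclass

@dataclass
class Marker:
    name: str
    x: float  # 2D map-screen coordinates projected from 3D positions
    y: float

AXIS_TOLERANCE = 0.5  # assumed contact tolerance, pending tuning

def axis_feedback(markers, cursor_x, cursor_y):
    """Return ('strong', marker) when both sliding axes intersect a
    marker, ('weak', marker) when only one axis crosses it, and
    ('none', None) otherwise."""
    for m in markers:
        on_x = abs(m.x - cursor_x) <= AXIS_TOLERANCE
        on_y = abs(m.y - cursor_y) <= AXIS_TOLERANCE
        if on_x and on_y:
            return "strong", m  # strong rumble; read the marker's name
        if on_x or on_y:
            return "weak", m    # weak rumble: a marker is on this axis
    return "none", None

def snap_to(marker):
    """Snap both sliding axes to a waypoint or the player marker."""
    return marker.x, marker.y
```

A usage example: sweeping the horizontal axis past a marker at (3, 4) yields weak feedback, and aligning both axes on it yields strong feedback plus the spoken name.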
Without any visuals for the camera, it is difficult to choose a good mouse-sensitivity speed for the camera controls. More testing is required to determine how fast the camera should rotate, and to find ways to make x-axis (up/down) orientation less confusing to players. If the speed is too slow, the player might not be able to tell that the direction of incoming sounds has changed; if it is too high, the player might have difficulty using the compass to discern how far they have turned. Turning in set intervals could address both issues by simplifying the problem, but could in turn make the player feel less free.
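The set-interval alternative mentioned above can be sketched as quantized turning: instead of continuous rotation, each turn input rotates by a fixed increment, so the compass always reports a clean offset. The interval value here is a guess pending playtesting, and the function name is hypothetical.

```python
TURN_INTERVAL = 45.0  # degrees per turn step; value pending playtesting

def apply_turn(yaw: float, steps: int) -> float:
    """Turn in fixed increments instead of continuous rotation.
    Positive steps turn clockwise (right), negative counter-clockwise.
    Yaw is kept in [0, 360)."""
    return (yaw + steps * TURN_INTERVAL) % 360
```

With a 45-degree interval, eight steps return the player to their starting facing, which keeps the compass readout trivially interpretable at the cost of movement freedom.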
The original idea for the camera was to make harvesting easier to understand: players would harvest the object they were facing. Because only one object would be harvested at a time, the player would have an easier time knowing what was added to their inventory. However, some objects are difficult to harvest correctly without x-axis camera movement, and x-axis movement can easily get the player lost without some way to reset orientation, which has not yet been implemented. For now, harvesting is still based on proximity to the object being harvested.
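The proximity-based fallback still admits a "one object at a time" guarantee by always picking the single nearest harvestable object in range. The sketch below assumes a flat dictionary of object positions and a placeholder range value; neither reflects the actual data structures in use.

```python
import math

def nearest_harvestable(player_pos, objects, max_range=2.0):
    """Pick the single closest harvestable object within max_range,
    so only one object is harvested per action even without a
    camera-facing check. max_range is an assumed placeholder."""
    best, best_dist = None, max_range
    for name, pos in objects.items():
        d = math.dist(player_pos, pos)
        if d <= best_dist:
            best, best_dist = name, d
    return best  # None if nothing is in range
```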
Significant discoverability of context was introduced by adding distinct audio ambiance to each implemented game state (the Map Menu, the Inventory Menu, and the active 3D game world). The abstract soundscapes associated with the menus allow the user to identify which state is active without having to provide input and interpret context from the response. From a development perspective this was a minor change, but its consequences for the user experience seem much more significant. Additional navigational ability could potentially be enabled by issuing audio and/or haptic feedback when the player character collides with objects in the 3D game world.
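The state-to-ambiance mapping can be expressed as a small lookup that swaps ambient loops on state change. Everything here is hypothetical: the file paths, state names, and the `play_loop`/`stop_loop` callbacks standing in for the engine's audio API.

```python
# Hypothetical mapping of game states to ambient loops.
AMBIANCE = {
    "world": "ambiance/world_loop.ogg",
    "map_menu": "ambiance/map_drone.ogg",
    "inventory_menu": "ambiance/inventory_hum.ogg",
}

class StateAudio:
    """Swaps the ambient bed whenever the active game state changes,
    so the player can identify the state without providing input."""

    def __init__(self, play_loop, stop_loop):
        # play_loop / stop_loop stand in for the engine's audio calls.
        self._play, self._stop = play_loop, stop_loop
        self._current = None

    def on_state_change(self, state: str):
        if state == self._current:
            return  # no change; keep the current loop running
        if self._current is not None:
            self._stop(AMBIANCE[self._current])
        self._play(AMBIANCE[state])
        self._current = state
```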
Going forward, two pursuits would be highly desirable. First, refactoring the implemented systems would go a long way toward increasing scalability and flexibility. This would be integral to enabling the second: to the extent feasible, brainstorming and rapidly prototyping as many UI/UX features as possible. Mapping features of visually oriented games and software interfaces to counterparts that rely purely on non-visual feedback would allow a much richer understanding of the direction general design and development should take. One such potential feature is "constant presence" feedback. In place of the walking-echo mechanic, in which the locations of objects near the player character are revealed as the player walks, objects would instead be associated with unique, simple auditory patterns constantly emitted from their spatial positions. As the player character approaches a particular object, its "audio signal" would grow stronger; as they depart, it would fade. While this risks introducing too much noise into the active feedback system, it could also substantially improve spatial awareness and discoverability. This is just one way in which current challenges might be addressed; there are likely countless others yet to be identified.
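The "constant presence" idea reduces to a distance-based gain curve per object. A minimal sketch follows, assuming linear falloff to silence at a fixed hearing range; both the falloff shape and the range value are assumptions (an inverse-square curve, as used by most 3D audio engines, would be the natural alternative).

```python
import math

def presence_gain(player_pos, obj_pos, max_hearing=20.0):
    """Volume for an object's constant 'audio signature': full volume
    at the object, fading linearly to silence at max_hearing units.
    Linear falloff and the range value are assumptions for this sketch."""
    d = math.dist(player_pos, obj_pos)
    return max(0.0, 1.0 - d / max_hearing)
```

Each frame, every emitting object's loop would be scaled by this gain, so approaching an object strengthens its signal and departing fades it, directly as described above.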
A significant amount of work was achieved this interval with regard to integrating the Map Menu with the game world context, building on the foundations . . .
Several changes were made to sounds in order to make the player experience easier to understand. Several sounds were either too quiet to hear over . . .