Week 2 of the first development interval was far more productive, in terms of tangible progress, than week 1, which was spent contextualizing the architecture and scope of the project based on experience gained from an earlier prototype, and reflecting on development priorities. Initially, the first development interval was meant to encompass a wide array of features, from getting the 3D spatial navigation mechanics in place to implementing essential menus and outlining various sub-systems to be elaborated later in development.
After deliberation and discussion, however, it was decided to reorganize development goals and prioritize initial implementations of as many user interface and experience features as possible. The core of this project is the exploration of highly unconventional interfaces and interaction types, which demand as much testing, feedback, and iterative improvement as can be allocated and procured. By restricting the scope of initial development to major user-interface and experience features, and to the essential game elements they pertain to, opportunities to detect problems (as well as solutions) in system usability, comfort, convenience, and pleasure will be maximized.
Revised work for this interval was divided evenly between the team. One member focused on implementing the 3D navigation mechanics, including movement of the player avatar through 3D space and echo-location mechanics that convey the identity and position of game-world objects to the player. This work is modeled on a 2D digital prototype developed previously, and while the essential mechanics are translating nicely into 3D space, the non-visual nature of this project means there is little to show off for this progress at the moment. Movement in 3D space has been implemented, and two of the three "echo" mechanics will be completed very soon.
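To illustrate the kind of positional data an "echo" mechanic must derive, here is a minimal, hypothetical sketch: computing the distance and horizontal bearing from the player avatar to a game-world object, values that could then drive the volume and stereo panning of an audio cue. All names and conventions here are illustrative assumptions, not taken from the actual project code.

```python
import math

def echo_cue(player_pos, player_yaw, object_pos):
    """Return (distance, relative_bearing) from player to object.

    player_yaw is the avatar's facing angle in radians on the
    horizontal (x, z) plane; positions are (x, y, z) tuples.
    The bearing is normalized to [-pi, pi): negative means the
    object is to the player's left, positive to the right."""
    dx = object_pos[0] - player_pos[0]
    dz = object_pos[2] - player_pos[2]
    distance = math.dist(player_pos, object_pos)
    # Bearing of the object relative to where the player is facing.
    bearing = math.atan2(dx, dz) - player_yaw
    bearing = (bearing + math.pi) % (2 * math.pi) - math.pi
    return distance, bearing
```

An object straight ahead yields a bearing of zero, so a sound engine could pan the cue fully center and attenuate it by distance; an engine such as Godot would normally handle the attenuation itself via its positional audio nodes.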
The other half of development labor for this interval focused primarily on a 2D map interface that lets the player detect information about game-world objects that are not proximal to their avatar. This information is communicated through user-controlled sliding axes, which trigger haptic motors in the input device when a slider intersects a map-marker representing a game-world object.

The need for iterative design was highlighted during development of this feature, as it was (and still is) not entirely clear how to utilize limited "feedback space". In the absence of all visual feedback, the two channels for communicating vital information to the player are auditory and tactile. With each there is a risk of using too much, confusing the player and drowning out what should be experienced as vivid and vital, or too little, leaving the player still confused and without means to engage with the systems being created.

Initially, when a slider moved across the map screen, a click would sound each time a gridline was passed, accompanied by a small vibration from the input device. In the opinion of the developer, at least, these small vibrations quickly grew irritating and distracting rather than helpful. However, when all vibration tied to the cursor's position was removed, the absence of vibration on reaching the borders of the map now felt empty, detached, and incomplete. Only when strong haptic feedback was added and strictly associated with map-border collision (rather than with any intermediate gridlines) was a satisfying experience realized. Of course, there is still much work to be done on UX design even for this bare-bones implementation of the menu, and additional features will require still further design work.
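The feedback rule the iteration converged on can be sketched in a few lines: an audio click at each gridline crossing, no intermediate vibration, and a strong haptic pulse strictly on border collision. This is a hypothetical, engine-agnostic sketch; the constants, function name, and event strings are illustrative assumptions rather than the project's actual implementation.

```python
GRID_SPACING = 0.1           # assumed spacing between gridlines, normalized units
MAP_MIN, MAP_MAX = 0.0, 1.0  # assumed normalized map extents

def cursor_feedback(old_pos: float, new_pos: float) -> list[str]:
    """Return the feedback events triggered by moving a slider
    from old_pos to new_pos along one map axis."""
    events = []
    # Audio click whenever a gridline is crossed -- no vibration here,
    # since intermediate haptics proved distracting in testing.
    old_cell = int(old_pos / GRID_SPACING)
    new_cell = int(new_pos / GRID_SPACING)
    if old_cell != new_cell:
        events.append("audio_click")
    # Strong haptic pulse strictly on map-border collision.
    if new_pos <= MAP_MIN or new_pos >= MAP_MAX:
        events.append("strong_haptic")
    return events
```

In an actual Godot implementation, the returned events would map onto an audio player node and a controller-rumble call; keeping the decision logic separate like this makes the feedback policy easy to retune during playtesting.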
One thing that will inform changes going forward: although, as elaborated above, priorities were revised to invest more heavily in interface and experience design, some progress was also made on game subsystems that have no direct user-facing component and will not be functional until later in development. This work happened not out of high regard for interval goals so much as on a whim, inspired by discoveries made while working with the Godot Engine, the central tool being employed to develop this project. While it is exciting that this subsystem (not yet worth describing in detail) is functional and the engineering knowledge for implementing it has been acquired, significant time was spent on this work that might have been better allocated to interface implementation, for the reasons elaborated above.
Because development progress was largely confined to system outlines, this entry has been somewhat vague. As these interfaces and mechanics mature, it is anticipated that future entries will communicate more interesting and detailed descriptions of progress. Communicating development progress through sound clips and video will also be considered.
In order to make movement easier to understand, camera functionality was added. The player can turn the camera to the left or right, but up . . .
A significant amount of work was accomplished this interval with regard to integrating the Map Menu with the game-world context, building on the foundations . . .