Info
10 weeks, UID, Komatsu Japan
team: Inna Zrajaeva, Oliver Weglinski
My contribution
Expert Interviews, Stakeholder Map, UI Design, Wearables Design + Sensory Design Mapping
Introduction
In this research-driven project, the task was to design and develop a control system + interface proposal that reduces the attention required for operators to run and supervise mining machines.
Challenge
Envisioning the remote mining operations of tomorrow and creating user-centered human-machine collaboration in a teleoperation system. In the future, humans will struggle to keep up with information flows and the pace of processes, as they lack sufficient information-processing capabilities and cognitive bandwidth.
SIRO is a sensorial interface for remote mining operations of the future
REMOTE CONTROL + AUTOMATION
“As operators, we need as much feedback as possible - visual obviously, but also sound and motion”
Problems in the context of remotely operated systems:
Operators out of the loop of automated systems
Lack of situational awareness and information about the environment
Lack of skills that could have been gained in a non-remote setting
Operator fatigue
Expert’s opinions:
“Observing a two-dimensional world is not what we are designed for.”
“With the rise of automation, we are becoming less autonomous.”
Professional user’s comments:
“I use a lot of senses during operation - sometimes I even use my nose to check that no grease is leaking from the machine”
Team’s focus:
DESIGN QUESTION:
How can we reduce the user’s physical and cognitive load through a multisensory experience?
Solution:
In this project we work with multisensory experiences of touch, sound, and smell to reach beyond the sense of vision. At the core of the system are wearables that expand the operator’s senses. The ability to actually feel more and sense more is highly valuable, especially when the operator is far away from the work location.
The operator is able to teach the system a specific excavating strategy in order to automate the tasks. Meanwhile, the wearables inform the operator about what is happening during the automated process.
Sense of touch:
The upper body - back and arms - is dedicated to the movement of the machine’s whole body, while the forearms and hands operate the bucket, mirroring the structure of the excavator itself.
Feeling the machine.
The placement of the wearables is chosen by three main factors: weight distribution, motion independence, and sensitivity to passive touch.
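The machine-to-body mapping described above can be sketched as a simple lookup table. The zone names, machine parts, and feedback values below are illustrative assumptions for the concept, not the project’s actual implementation.

```python
# Illustrative sketch: mapping excavator parts to wearable haptic zones.
# Zone names and intensity scaling are assumptions for demonstration only.

MACHINE_TO_BODY = {
    "chassis": "back",     # whole-body movement of the machine felt on the back
    "boom": "upper_arm",   # the excavator's large arm
    "arm": "forearm",      # the stick/arm maps to the forearm
    "bucket": "hand",      # bucket actions felt in the hand
}

def haptic_event(part: str, load: float) -> dict:
    """Translate a machine event into a haptic cue on the matching body zone.

    `load` is a normalized 0..1 value (e.g. resistance felt by the bucket);
    it is clamped so the cue intensity never exceeds full strength.
    """
    zone = MACHINE_TO_BODY.get(part, "back")
    return {"zone": zone, "intensity": round(min(1.0, load), 2)}

print(haptic_event("bucket", 0.73))  # cue delivered to the hand
```

A dictionary keeps the mapping easy to retune per operator, which matches the idea of wearables adapting to the individual.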
Sense of smell:
Different materials are represented through different smells. It is possible to specify a type of odour (e.g. minty), a level of intensity (depending on how important the information is), and a frequency (e.g. every time a certain material is detected in the excavated portion).
Detecting different materials.
“The smell is getting more intense, I suspect that this portion of material has more gold inside. I need to be careful and adjust the movements of the excavator because this type of material needs a different approach.”
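The three olfactory parameters above (odour type, intensity, frequency) can be modeled as a small configuration record. The material names and values here are hypothetical, chosen only to illustrate the idea.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class OdourCue:
    odour: str            # type of odour, e.g. "minty"
    intensity: float      # 0..1, scaled by how important the information is
    every_n_buckets: int  # frequency: emit the cue on every Nth detection

# Hypothetical material-to-smell assignments for illustration.
ODOUR_MAP = {
    "gold_ore": OdourCue("minty", 0.9, 1),   # high value -> strong, frequent cue
    "granite": OdourCue("earthy", 0.3, 5),   # routine material -> subtle, rare cue
}

def cue_for(material: str, detections: int) -> Optional[OdourCue]:
    """Return an odour cue if this detection count should trigger one."""
    cue = ODOUR_MAP.get(material)
    if cue and detections % cue.every_n_buckets == 0:
        return cue
    return None
```

Tying frequency to a detection count keeps routine materials from saturating the operator’s nose while letting important ones come through every time.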
Sense of hearing:
With sound we can create a soundscape containing information about weather conditions, the location of the mine, and its size. Trucks moving around the mine create different sounds depending on their proximity to the excavator.
The symphony of the mine.
“I can hear that I’m in Kiruna, in the north of Sweden; it’s a big open-pit mine. It’s freezing out there, and dark. There are a lot of active operators, and there is a truck coming in 5 minutes.”
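One way to read the proximity cue above: map a truck’s distance to loudness with a simple falloff. The audible range and the linear curve are assumptions for illustration, not project parameters.

```python
def truck_loudness(distance_m: float, max_audible_m: float = 500.0) -> float:
    """Map a truck's distance to a 0..1 loudness value for the soundscape.

    Linear falloff: full volume at 0 m, silent beyond `max_audible_m`.
    Both the range and the curve are illustrative assumptions.
    """
    if distance_m >= max_audible_m:
        return 0.0
    return round(1.0 - distance_m / max_audible_m, 2)

print(truck_loudness(100.0))  # nearby truck, loud
print(truck_loudness(600.0))  # beyond audible range, silent
```

In practice a perceptual (e.g. logarithmic) curve might fit human hearing better; the linear version just shows the mapping.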
Sense of vision:
Vision sensors have edge detection, contrast enhancement, image recognition, and 360-degree vision. A specific material can be detected using an x-ray sensor, which enables the operator to literally “see through” excavated material.
Seeing through the rocks.
The user receives visual feedback via two screens. The front screen shows the first-person view as well as a side view of the excavation process.
PERSONAL OPERATION SPACE
1 OPERATOR + 1 EXCAVATOR
RESEARCH:
Big machines and people with even bigger hearts
Research trip to meet miners, operators, truck drivers, supervisors and dispatchers
We dived into the complexity of structures, responsibilities, and tasks by talking to the workers, documenting in video and audio, and getting to know more about their work and life from their own unique perspectives.
“Best part of my job? Lovely people!”
“It’s a chessboard - every day is exciting.”
Exploring possible futures and making sense of gathered material with experts
Designing for the senses stood out from the research presentation
The research presentation covered the full research process. We presented our four weeks of research and the areas we found worth diving into. We narrowed the areas of interest by combining the themes of sensory design, understandable AI, and translating the mine environment. After additional research, we decided to focus on one operator and one machine to be able to explore the opportunities fully.
Translating the mine - bringing operators closer to the real environment
Keeping operators in the loop (decision support system, understandable AI)
Making operation more engaging (design for all senses)
Concept development - lo-fi prototypes - testing - form-giving loops
How to combine feedback from different modalities?
How to translate machine actions into feedback?
Which modality is appropriate for which task?
Should the sensory experience mimic reality?
Framework development: sensorial matrix and alchemy of the senses
Sensorial exploration: What task - what feedback
We worked on an in-depth analysis of the properties of the senses and the specific features of sensorial feedback: which type of action should be applied to convey each piece of information.
Sensory analysis: Characteristics of the senses
Simultaneously, it was necessary to decide on the number of modalities needed for a certain task - it can be one or several. We can then also manipulate the sensory characteristics to explore how one modality influences another.
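The “what task - what feedback” matrix can be sketched as a table keyed by task, with one or several modalities per entry. The task names and modality assignments are illustrative assumptions, not the project’s actual matrix.

```python
# Illustrative sensorial matrix: which modality (or combination) serves which task.
# Task names and assignments are assumptions for demonstration only.
SENSORIAL_MATRIX = {
    "material_detected": ["smell", "vision"],       # odour cue + see-through view
    "bucket_resistance": ["touch"],                 # felt in the hand
    "truck_approaching": ["sound"],                 # proximity in the soundscape
    "machine_fault": ["touch", "sound", "vision"],  # critical: redundant channels
}

def modalities_for(task: str) -> list:
    """Look up which feedback channels to engage for a given task."""
    return SENSORIAL_MATRIX.get(task, ["vision"])  # default to visual feedback
```

Encoding the matrix as data rather than logic makes it easy to test alternative assignments during the lo-fi prototyping loops.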
Sensory mapping: Human body as a “natural map”
The body serves as a natural map: the back and arms correspond to the movement of the machine’s whole body, while the forearms and hands operate the bucket, mirroring the structure of the excavator.
VISION OF THE FUTURE
Reflection on the bigger picture
User-specific multi-sensory interface
Constantly adapting to the operator’s needs and to the specific task.
New, universal language
Establishing a unified framework for remote operations that utilizes multiple sensory inputs.
Embodied cognition
Utilizing hand gestures, locomotion and voice commands in order to operate the machine.
Open-source library
An expanding source of knowledge that seeks to formulate a new design methodology.
Important learnings
I understood the real challenge of testing: it is difficult to create an experience where all the features are included, while testing them in isolation does not bring a close-to-real feeling - which is exactly what happens in development.
Who is the user? We asked ourselves this question multiple times, and it is tricky: the user does not yet exist in this very specific area, and professional users, with their knowledge and experience, cannot easily relate to the challenges of a remotely operated system.
Taking the time to investigate fields where elements of remotely operated systems or sensory design are already implemented or tested (in this case: military, advanced emergency systems, drones, and some multisensory VR experiences).