Haptic Interface for Remote Rock-Breaking

general

team

collaborating partners

10 weeks

Professional User Interface Course

Umeå Institute of Design

Eduardo Ferreira

Akansha Aggarwal

Boliden

Swedish Interactive Institute

Oryx Simulations

 

design question

How might we help rock-breaking operators adapt to an automated future?

PROPOSAL

Iwa enables operators to feel the rocks.

Creating a multi-sensory experience for mine operators makes their job easier and more engaging, while leveraging the benefits of semi-automation to achieve high levels of performance.

Features

Screen

Live video feed

Augmented 3D information about the rock

Overview of the mine’s processes.

Control Panel

Navigate the GUI

Control the rammer’s vertical distance from the rocks.

Crusher Palette

Morphs into the exact shape of the rock that needs to be broken.

Pointer

The rammer arm follows the movements the operator makes with the Pointer.

It can send the command to break a rock, and it uses haptic feedback to indicate when the rammer is in action.

How does it work?

How to move the rammer?

How to break the rock?

How to move a rock?

 

Understanding the job

What is rock-breaking?

One of the tasks in mining is to transport rocks from where they are extracted to the plants where they are processed.

Crusher pit as seen from the operators' cabin.

Where to hit/move the rocks?

Operators use 3D information about the rock’s surfaces to create their strategy.

When to hit the rocks?

Big rocks don’t come along all the time; when they do, however, the operator needs to break them as fast as possible to avoid stopping the production line.

What type of information do they need?

What do they control?

Mainly the rammer: a mechanical arm that moves sideways, up and down, and, once positioned over a spot, repeatedly strikes the rock with its tip until it breaks.

The various experiments MIT Media Lab has been doing with shape-display technology suggest that our concept could be feasible with the right technological support.

 

What if these "physical pixels" that move up and down were the size of a needle and packed as densely as the pixels on our HD screens?

 

We would be able to produce a very accurate tactile image, and this would suit the needs of our users and their context.
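To make the idea concrete, here is a minimal, purely illustrative sketch (not part of the project deliverables) of how such a dense pin display could be driven: a depth map of the rock is downsampled to the pin grid and each "physical pixel" is raised to the matching height. The grid size, stroke length, and function names are all hypothetical.

import numpy as np

# Hypothetical values: pin grid resolution and how far each pin can rise.
PIN_GRID = (128, 128)
MAX_STROKE_MM = 40.0

def depth_map_to_pin_heights(depth_map_mm):
    """Downsample a rock depth map (mm above the surface) to one height per pin."""
    rows = np.array_split(depth_map_mm, PIN_GRID[0], axis=0)
    grid = np.array([[block.mean() for block in np.array_split(row, PIN_GRID[1], axis=1)]
                     for row in rows])
    # Clip to the stroke the pins can physically travel.
    return np.clip(grid, 0.0, MAX_STROKE_MM)

# Example with a fake depth map standing in for the mine's 3D scan.
fake_depth = np.random.rand(720, 1280) * 60.0
pin_heights = depth_map_to_pin_heights(fake_depth)
print(pin_heights.shape)  # (128, 128) pin heights in millimetres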

 

 

Joysticks, conveyor belt and rammer arm.

Amongst the concepts we prototyped was one where the operator would perform all activities on a touch screen.

We wanted the best ergonomics for the operator

The dynamic surface

Through the live video feed we provide visual confirmation that the actions performed through iwa are happening.

This aims to compensate for the fact that the operators are not on the spot and cannot see it for themselves.

It is also where the GUI for the mine’s system lives.

The operator needs to see the machine and other processes

The rock-breaking operation is still part of a bigger system.

The operator needs to navigate through it and pick which one he/she is going to work on, and doing that with the Crusher Palette alone was a bit confusing.

Many functions must be controlled through the interface

With the Pointer we enable the operator to explore the surface and, at the same time, directly pick a specific point on the rock to hit. This became possible when we allowed ourselves to automate part of the movement: the operator just gives the final coordinates, and the robotic arm calculates the best path to them.
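A rough sketch of that division of labour, with hypothetical names and none of the real Oryx or Boliden control code: the operator's Pointer only supplies the target point on the rock, and a planner on the machine side works out the motion to reach it.

from dataclasses import dataclass

@dataclass
class StrikeCommand:
    """What the operator sends with the Pointer: just a point on the rock."""
    x_mm: float
    y_mm: float
    z_mm: float

def plan_rammer_path(target, current_tip_mm, steps=10):
    """Hypothetical planner: the machine, not the operator, decides the path.

    This simply interpolates the tip straight toward the target; a real system
    would run inverse kinematics and collision checks on the rammer's joints.
    """
    cx, cy, cz = current_tip_mm
    return [(cx + (target.x_mm - cx) * t / steps,
             cy + (target.y_mm - cy) * t / steps,
             cz + (target.z_mm - cz) * t / steps)
            for t in range(1, steps + 1)]

# The operator taps a spot; everything from here on is automated.
waypoints = plan_rammer_path(StrikeCommand(1200.0, -350.0, 80.0), (0.0, 0.0, 500.0))
print(waypoints[-1])  # the path ends at the chosen strike point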

The pointer is the best way to define an exact spot on the rock

The way we control the rammer is very important. During our visit to Oryx Simulations we realized how unintuitive it is to control all the joints of the rammer with two different joysticks. We tried several prototypes but ran into the same complications whenever we had to add ways to control all the possible movements the rammer has.

We visited an open-pit mine in the north of Sweden: Aitik, in Gällivare.

 

We observed and talked to rock-breaker operators to guide our project.

Rocks must go on conveyor belts and through crusher pits. Sometimes they are too big to fit through, so they must be broken into smaller pieces.

We discovered, however, that if we placed the screen at the right angle and distance for the eyes, it would be too far away for the hand to reach, and vice versa: if it was close enough for the hand, then it was too close for the eyes.

Therefore, we divided the interface in two.

Getting inspiration from future technologies.

Why did we do it like this?

The process

Creating an interface for rock-breaking.

final thoughts

Reflections and My Role

We tried to question a lot of what is being said about automation in the future. Personally, I worry that too much automation will create boring, unchallenging jobs, so I always tried to steer the conversation towards how we could maintain productivity and operator engagement at the same time. That is also why we focused on having something tangible: something more than a screen-based task.

We explored with physical prototypes and tried them out with different people. I think I had an important role in encouraging the making of quick and simple prototypes.

 

There are at least two things I would improve about iwa. The first is that the technology we propose for creating a surface from which 3D shapes emerge doesn't get enough prominence in the concept video. I do think, however, that we used the resources and skills we had as best as we could to deliver our message.

The second is that the feedback and the control of the rammer's vertical distance from the rocks should be in the same channel. Right now the way to change this distance is in the Control Panel while the live feedback is in the Pointer, and even though we had our logic behind it, this split could be confusing because it is not obvious.

© All rights reserved. Melissa Hellmund.

2018.
