Clever touch-based display could revolutionize 3D printing for the blind

Touch-based display (Image credit: Stanford University)

3D printing could be made more accessible thanks to a nifty touch-based display which is being developed by researchers from Stanford University.

The idea is to allow blind and visually impaired people better access to 3D modeling and the ability to create objects for 3D printing using the display.

The ‘2.5D’ tactile display is essentially an array of pins that can be raised to different heights – the same sort of thing you might have seen in (obviously much more basic) ‘pin art’ toys before – allowing a blind user to feel the shape they are creating.

In other words, it is possible to create a 3D model to be printed and feel what it’s actually like, rather than relying on a sighted person to describe it. Objects being displayed can be rotated and zoomed in on, as SlashGear reports.

The display can also pull off some useful tricks: when making a model of a cup (as shown in the video below), for example, the top and bottom sections of the vessel can be displayed beside the object.

The touch-based display is labeled 2.5D because the pins rise from a flat base that never changes shape, so it isn’t fully ‘three-dimensional’ as it were.

The device was developed by the Stanford researchers in conjunction with members of the blind and visually impaired community, and offers a new level of independence in 3D printing (and 3D modeling in general): no sighted mediator is required to provide additional feedback on any given design.

Agents of creation

Joshua Miele, a blind scientist who is a co-author on the device’s paper and helped develop it, commented: “It opens up the possibility of blind people being, not just consumers of the benefits of fabrication technology, but agents in it, creating our own tools from 3D modeling environments that we would want or need – and having some hope of doing it in a timely manner.”

The display was presented as a prototype at the ACM SIGACCESS Conference which took place at the end of October, and is part of a larger drive to develop tactile displays led by Sean Follmer, assistant professor of mechanical engineering at Stanford.

The eventual hope is that this touch-based display will be realized in a larger model which can represent shapes in greater detail (with smaller pins for an effectively higher resolution), while also costing less than the current prototype.

Considerable strides are being made on the accessibility front in modern computing – in hardware, with advances such as this one, and in software, with Windows 10 making a fair number of changes (including the addition of eye-tracking functionality) to ensure that the OS becomes more accessible.