Rashi's WIP


Hi, I'm Rashiga Walallawita from Sri Lanka (I hope Sri Lankans are also allowed in the competition).

I'm currently a fresh undergraduate from the Department of Mechanical Engineering, University of Moratuwa. I have a great passion for 3D modeling and product design.

Here is my WIP; please add your comments and critiques, since I have a lot to learn. :thumbsup:

1. Goals
-A revolutionary design for a user interface or menu structure

  • A revolutionary user experience for the driver of a vehicle
  • A revolutionary way that users can better interact with the features in their vehicle while operating it

Minimum Requirement
-Interface design (Speedometer, Fuel gauge, Navi, Radio, and Climate)

2. Objectives
-Learn interface methods
-Learn interaction methods
-Human machine interface design methodology
-Design for the interface method
-Design a cockpit and dashboard

3. Available methods
[U]Human machine interface methods[/U]
-Tactile (touch/button)
-Voice command
-Gesture control
-Mind control
-Pupil movement
-Remote control

Methods of delivering data
Display technologies
-Flexible display
-Transparent display
-Projection technologies (heads-up displays, hologram displays)
-3D display (with/without glasses)
-VR displays (Oculus)

-Virtual Assistant
-Audio signals

-Change of temperature
-Electric impulse

-Direct mind integration (thought/sound)

4. Proposed methods
All methods were selected considering their viability in the near future.
Buttons will still be there – the most reliable interface method to date, so the most important interactions will be done with buttons.
Touch display – the main display interaction method.
Voice control + brain pattern analysis – the system will take into account voice commands and the user's brain patterns; it will learn the patterns and provide better predictions.
Gesture control + brain pattern analysis – similar to the above.
Mind control – not a viable solution for the near future.
All of these methods will work independently of each other.


One design I came up with:


Apple CarPlay

Land Rover Smart Glass

Audi MMI system

Audi + NVIDIA vision interface and Tegra integration

Tesla Visual Computing Module

I really like the Land Rover Smart Glass concept; it looks very promising.


Future Predictions


My Idea for the future cockpit

I've narrowed my scope down to a standard family car in the near future for my design.

The cockpit will be designed to deliver the following features.

  1. Full manual driving = without any assistance; let's say we need to burn some rubber :wink:
  2. Full manual driving + assistance systems = with all the safety features.
  3. Full autonomous driving = after a stressful day of work, sit back, relax, and have a good time until you get home.
  4. Any suggestions?


Nice ideas, nice concept; waiting for the next step.

All the best



Thanks adib :slight_smile:

Here’s a bit of a pre-vis I did
The video shows the transformation process from manual to autonomous mode.

Video link


Had some more time to do more doodling.

The following image shows how the UI system will look when the vehicle is in manual mode.

This image shows how the UI will look when the vehicle is in autonomous mode.

I still have to do some refining and come up with the UI.

For the MMI system, I'm planning to go for a method similar to Apple CarPlay, but the interface will follow the driver's phone preference.
He can choose an interface similar to Apple, Windows, or Android, or a custom design that comes with the car.
The reason for this approach is that nowadays the most-used interfacing method is the smartphone, so if the vehicle has a similar interface, the driver will be more familiar with it.
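Just to make the idea concrete, here's a minimal sketch of that selection logic in Python. The OS names, theme names, and function are all placeholders I made up for illustration, not a real CarPlay or Android Auto API:

```python
# Hypothetical mapping from the paired phone's OS to an in-car UI
# theme; anything unrecognized falls back to the car's own design.
PHONE_TO_THEME = {
    "ios": "apple-style",
    "android": "android-style",
    "windows": "windows-style",
}

def select_theme(phone_os, driver_override=None):
    """An explicit driver preference wins; otherwise mirror the phone."""
    if driver_override:
        return driver_override
    return PHONE_TO_THEME.get(phone_os, "car-custom")

print(select_theme("ios"))                    # apple-style
print(select_theme("blackberry"))             # car-custom
print(select_theme("android", "car-custom"))  # car-custom
```

The override parameter covers the case where the driver explicitly prefers the car's custom design even though a phone is paired.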


My idea for the heads-up display: the most important 20% of information (compact and concise) will be displayed here, which will help the driver keep his eyes on the road at all times. Options can be controlled by voice commands (+ brain pattern monitoring) or from the interactive touch pad on the steering wheel.

By using a two-way input method, both a novice user and an advanced user can easily use this system. A novice user can use the interactive keys on the steering wheel to access the menus.

Meanwhile, the car will analyze the brain pattern and bookmark it with the function the driver uses. When the driver is comfortable, he can just say the name of the function as a voice command, which will help the vehicle accurately judge the function the driver is requesting.
So a more trained user can use the functions directly with fewer distractions.
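As a rough sketch of that bookmarking-and-prediction idea (in Python, with made-up labels like "relaxed" standing in for whatever a real brain-pattern classifier would output), the learning step could look something like this:

```python
from collections import Counter, defaultdict

class CommandPredictor:
    """Toy learner: associates a coarse brain-pattern label with the
    voice commands issued alongside it, then predicts the most likely
    command when that pattern reappears."""

    def __init__(self):
        # pattern label -> counts of commands seen with that pattern
        self.history = defaultdict(Counter)

    def observe(self, pattern, command):
        """Bookmark a (pattern, command) pairing as the driver uses it."""
        self.history[pattern][command] += 1

    def predict(self, pattern):
        """Return the most frequent command for this pattern, or None."""
        if pattern not in self.history:
            return None
        return self.history[pattern].most_common(1)[0][0]

predictor = CommandPredictor()
predictor.observe("relaxed", "play music")
predictor.observe("relaxed", "play music")
predictor.observe("focused", "navigation")
print(predictor.predict("relaxed"))  # play music
```

A real system would of course replace the string labels with the output of an actual pattern classifier; this only shows the bookmark-then-predict loop.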

The heads-up display is a hologram display with adaptive background colour and brightness settings, which will help display details clearly to the user at all times.
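One simple way such an adaptive display could keep text readable is the standard luma-threshold trick: measure the background brightness and flip the text colour accordingly. A tiny sketch (the threshold of 128 is my assumption, not a spec):

```python
def text_color(bg_r, bg_g, bg_b):
    """Pick black or white HUD text from the background's brightness,
    using the standard Rec. 709 luma weights (inputs in 0..255)."""
    luma = 0.2126 * bg_r + 0.7152 * bg_g + 0.0722 * bg_b
    return "black" if luma > 128 else "white"

print(text_color(240, 240, 240))  # black  (bright daytime sky)
print(text_color(10, 10, 30))     # white  (night road)
```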

The display can be moved wherever the driver wants it, and it can be expanded to show more details.

This will also display any warning messages to the driver. To minimize distraction, the message will first be announced audibly by the virtual assistant and then displayed after a small delay. Audible notification assistance can be fully customized to the user's preference, but in case of imminent danger the voice notification will always be on.
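A rough sketch of that notification flow (the function name, event tuples, and delay value are just illustrative):

```python
import time

def warn(message, imminent=False, voice_enabled=True, delay=1.0):
    """Return the ordered notification events for a warning.

    Voice comes first so the driver's eyes stay on the road; the HUD
    message follows after `delay` seconds. Imminent-danger warnings
    always speak and display at once, overriding the mute preference."""
    events = []
    if imminent or voice_enabled:      # voice can't be muted for danger
        events.append(("voice", message))
    if not imminent:
        time.sleep(delay)              # small delay before the visual
    events.append(("display", message))
    return events

print(warn("Low washer fluid", voice_enabled=False, delay=0))
# [('display', 'Low washer fluid')]
print(warn("Obstacle ahead!", imminent=True))
# [('voice', 'Obstacle ahead!'), ('display', 'Obstacle ahead!')]
```

Note how the imminent flag bypasses both the user's mute preference and the display delay, matching the always-on rule above.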

I hope I will have enough time to finish this :banghead:


Doing design with an engineering background is not easy :banghead:

Please give some comments :deal:


Sigh… no one is interested in my WIP, at least enough to give a comment.


The design of the interior looks really nice :slight_smile:
All the best



Thanks adib

Autonomous mode, with less clutter and more space for the driver. The specially designed retractable steering wheel will especially help in this matter.


Micro-actuated devices, deformable materials, smart materials


Looking good rashi, I see that some of our overall ideas are similar, but I think you’ll beat me easily with the modelling :slight_smile:


Yea just noticed it :smiley: after reading your post.

I'm only good at modeling; at other things I really suck.


A little update!


Hi Rashi,

I saw your latest update. Nice work! I have one comment about it though. Is there a way to animate the steering wheel such that it wouldn’t knock the driver’s coffee out of their hands while it transforms? I think the concept is great but maybe having the steering wheel move towards the driver while in transition could be dangerous.

Good luck! :slight_smile:


Yes you are absolutely right!

The mechanism is still a bit sketchy, but I think if we do a proper anthropometric study and come up with a better motion for the steering handles, the retracting process can be made less distracting for the driver.

Thanks for pointing that out! :thumbsup:

Best of luck!