
Prototyping for Design

Robots

Today we had an introduction to robotics and learned about robotic arms. The benefit of using a robotic arm over other manufacturing methods is its flexibility across multiple axes. Like a CNC machine it can carry multiple end effectors, but it also opens up other possibilities such as antigravity printing. Through Grasshopper we saw how to create simulations and played around with scripts to see how the parameters could be controlled and modified. When creating a path to join points together, the path can also be defined by the joining curve between the points rather than by linear segments. This curve, or spline, should then be converted so that each point along it has a corresponding plane. In the Grasshopper simulation file, I was able to download the plugins and libraries for robotics and run the script that draws a line in space. The images from running the scripts can be found below.
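As a rough illustration of that conversion — not the actual Grasshopper script, which used the robotics plugin's own components — the sketch below samples a hypothetical curve (a shallow helix) and attaches a plane, here simplified to an origin plus a unit tangent, to each sample point, the way a spline toolpath gets broken into robot targets:

```python
import math

def sample_curve_planes(n):
    """Sample a helix-like demo curve and attach a plane (origin + unit
    tangent) to each sample, the way a spline toolpath is converted
    into a sequence of robot targets."""
    planes = []
    for i in range(n):
        t = i / (n - 1)  # parameter from 0 to 1 along the curve
        # hypothetical demo curve: one turn of a shallow helix
        origin = (math.cos(2 * math.pi * t), math.sin(2 * math.pi * t), t)
        # tangent from the analytic derivative, normalized to unit length
        dx = -2 * math.pi * math.sin(2 * math.pi * t)
        dy = 2 * math.pi * math.cos(2 * math.pi * t)
        dz = 1.0
        length = math.sqrt(dx * dx + dy * dy + dz * dz)
        tangent = (dx / length, dy / length, dz / length)
        planes.append((origin, tangent))
    return planes

targets = sample_curve_planes(10)
```

In Grasshopper the equivalent step divides the curve into points and orients a plane at each one so the robot's end effector can follow the spline instead of jumping between points linearly.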

Blender

In this class we learned the origins of Blender, the community behind it, and the variety of applications it supports. In picking a design tool, it is important to pick one that you can learn from and apply in multiple ways. Blender is used for animation, CAD, and rendering, and can be connected to a multitude of devices and interfaces. There are a lot of tools that sell the instant gratification of making something easy, and often we do not want to spend the time to learn a tool because it is not intuitive. However, this ease is more about wanting a quick solution and boosting the user’s ego. In the end, we know less and less about how things work, and because of this we limit our overall potential for skills and making.

Interfaces

In this course we learned about live coding and used a visualization tool called Hydra to learn how we can manipulate and create our own live visuals. We were able to edit existing visuals, write from scratch, use our own cameras, and have the visuals controlled by sound inputs. Coming from a non-coding background, it was helpful to see how each command would change the visuals and parameters. It helped me understand the direct relationship between the code and the output, and the possibilities of what could be made using these tools.

Blender + Interfaces

During this session we started using Blender, learned basic functions for moving models in space, and created an array. We were able to do this using coding commands, which showed the foundation of how Blender works. Unlike other CAD and rendering programs, Blender offers many ways to interact with it. We then saw how Arduino can be connected to Blender: a switch modeled in Blender could control a light on an Arduino. The images below show the demo of the Blender interface that would control the light switch. We had some difficulty setting up the light on the Arduino, but it was good to see another way these tools could be used together and applied. Overall it made me interested in learning Blender.
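Since I didn't save the exact console commands, here is a minimal stand-alone sketch (plain Python, no `bpy`) of the idea behind the array we built: each duplicated object is shifted by a fixed offset from the previous one. In Blender itself this would be done with duplicate commands in the Python console or with an Array modifier.

```python
def array_positions(start, offset, count):
    """Positions for a linear array: copy i sits at start + i * offset,
    the same spacing rule an Array modifier applies to duplicated objects."""
    positions = []
    for i in range(count):
        positions.append(tuple(s + i * o for s, o in zip(start, offset)))
    return positions

# e.g. five copies spaced 2 units apart along the x axis
print(array_positions((0.0, 0.0, 0.0), (2.0, 0.0, 0.0), 5))
```

Seeing that the interface actions reduce to simple commands like this is what made Blender's scripting foundation click for me.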

Microchallenge III

Full repo of our process can be seen here: https://github.com/samipiercy/futurecravings_microchallenge3

For the third Microchallenge, I worked with June and Josephine to create interactive posters for our intervention, Future Cravings. For our final intervention we will be having an installation and drinks instead of the dinner that Future Cravings centers on. We wanted a way to show the dinner experience and let people access more information than a flyer alone could offer. We had been interested in learning AR tools and Blender, so we decided to take this week to learn and build some AR experiences.

Scanning an image prompts the program to open the AR model:

It first took some research to find an AR tool that would let anyone scan an image and be prompted to the AR model. We wanted the experience to be easy to use, and many tools require participants to download an app. We started with A-Frame, which is a great tool because it is open source. However, we had difficulty adding our 3D model, so we moved over to Blippar. This tool allows you to drop in images or models and animate them inside the program itself. We ended up with three prototype AR experiences. The first was a Rhino model animated inside Blippar, where the light and movements could be manipulated. The second took the Rhino model and animated it in Blender; this video was then placed in Blippar, so after scanning an image you would be prompted to the video. The third used Adobe Aero and MetaSparks, where the animation could also be manipulated in the program.
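For reference, a minimal A-Frame scene that loads a 3D model looks like the markup below. The file name `model.gltf` is a placeholder; getting our Rhino model into a format A-Frame accepts (glTF) was part of the difficulty we ran into.

```html
<!-- minimal A-Frame scene; "model.gltf" is a placeholder path -->
<html>
  <head>
    <script src="https://aframe.io/releases/1.4.0/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <a-assets>
        <!-- preload the model so the entity below can reference it by id -->
        <a-asset-item id="tableware" src="model.gltf"></a-asset-item>
      </a-assets>
      <a-entity gltf-model="#tableware" position="0 1 -3"></a-entity>
      <a-sky color="#ECECEC"></a-sky>
    </a-scene>
  </body>
</html>
```

Because this is just a web page, anyone with a browser can open it without installing an app, which is why A-Frame matched our accessibility goal even though the model export tripped us up.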

Here you see the Rhino model being animated inside of Blippar:

This is an example of the poster, and Blender animation:

The next steps would be to animate inside Blender but place the animated object itself into Blippar or A-Frame. This would allow the AR model to appear as if it is growing out of the scanned image. We had difficulty getting the animation to transfer from Blender to Blippar.

Overall we were able to experiment a lot with different programs. As a group it was also great to experiment on our own and then come together to share knowledge and assets. I also enjoyed learning Blender and would like to continue exploring this program. Beyond the posters, AR could be used to illustrate stories and information. For example, when we are showing biomaterial tableware, the AR could explain more about the recipes or illustrate how the item might biodegrade. It helps present information in a more engaging way and can be edited and added to easily. There is a conflict between wanting people to be immersed in an experience and not wanting them isolated on their phones. However, if AR is used to share information at specific moments, or as a conversation starter, then this isolation can be avoided. Even though these types of AR interactions have been around for a while, there is still an element of fun and excitement to them. It is also a nice way to show animation interactively, as you can look at the model from different angles yourself and move it in the space around you. In this way the person interacting with it feels some ownership or control over the experience.


Last update: June 14, 2023