News

3D printing with multiple filaments

Emerging Gizmology has a 3D printer now, the Monoprice Select Mini V2. After 3D printing for a while, you realize that prints can go wrong in all kinds of ways.

  • The space around your print head can fill with filament and jam, causing the buildup to burn against the hot end and preventing you from printing until you clear the jam.
  • Your prints can look like absolute spaghetti (and possibly jam your print head).
  • The mechanism feeding filament to your printer may fail to feed softer filaments to the head.
  • Your prints might fail to adhere to the printer bed, resulting in spaghetti.
  • Your prints might print fine until near the end and then give you spaghetti (and possibly jam your print head).
  • Parts of your print might not print.
  • Your 3D printer may become sentient and err in ways you’ve never seen before.

To prevent most of these errors from happening to our 3D prints, we’ve created a best practices page to reference right before printing.

Be patient- Most 3D prints take a long time; it’s important to come to terms with that before you print and inevitably fail at least once.

Always know your nozzle and bed temperature- Printing at the wrong temperature because you assumed it is a quick way to jam your print head. Look up your filament’s recommended temperatures online before you print; it will save you time in the long run.
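As a rough reference, the lookup could be as simple as a table like the sketch below. The numbers are ballpark starting points for common filaments, not exact values; always trust the range printed on your specific spool over anything here.

```python
# Ballpark nozzle/bed temperatures (degrees C) for common filaments.
# These are rough starting points only -- always check the
# manufacturer's range printed on your specific spool.
FILAMENT_TEMPS = {
    "PLA": {"nozzle": 200, "bed": 60},
    "ABS": {"nozzle": 240, "bed": 100},
    "PETG": {"nozzle": 235, "bed": 80},
    "TPU": {"nozzle": 225, "bed": 50},
}

def temps_for(filament):
    """Look up temperatures before printing instead of guessing."""
    try:
        return FILAMENT_TEMPS[filament.upper()]
    except KeyError:
        raise ValueError(f"No preset for {filament!r}; check the spool.")
```

Checking the table before every print is faster than clearing a jammed hot end afterwards.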

Use raft build plate adhesion- There are many build plate adhesion types; the raft is the most resource-intensive but, in my experience, also the most effective at keeping prints attached to the bed.

Wipe the build plate before a print- Oil from your fingers will prevent prints from attaching to the plate; a microfiber cloth should pick up most of that oil.

Preheat- You can print without preheating, but preheating before a print helps the raft adhere to the bed.

This is not on the best practices list, but it is VITAL to printing with softer materials.

When printing with softer materials like rubber, be sure to turn off “Enable retraction”- Retraction is great for hard materials like PLA because it prevents stringing and uneven threads, but pulling a soft filament back and forth can bunch it up and jam the feed mechanism.
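The flexible-filament tweaks above can be summarized as a small profile adjustment. This is an illustrative sketch, not an exact slicer profile: the key names only loosely mirror Cura-style settings, and the speed values are assumptions based on the general advice to print soft filament slowly.

```python
# Illustrative slicer tweaks for flexible filament (TPU/rubber-like).
# Key names loosely mirror Cura-style settings; treat this as a
# checklist, not a real profile.
HARD_FILAMENT = {"retraction_enable": True, "print_speed_mm_s": 50}

def flexible_profile(base):
    """Return a copy of a profile adjusted for soft filament:
    retraction off (soft filament bunches up when pulled back)
    and a slower print speed so the extruder can keep up."""
    soft = dict(base)
    soft["retraction_enable"] = False
    soft["print_speed_mm_s"] = min(base.get("print_speed_mm_s", 50), 25)
    return soft
```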


Filming with a 360 camera – After Effects

A screenshot from a video of me skating across UT Dallas with a monopod and a Samsung Gear 360 camera

It’s easy to skate around campus with a 360 camera and get good shots, but it isn’t easy to edit the footage afterwards. To get the footage to look like this, I had to “stitch” the two fish-eye images together into an equirectangular image that covers the whole screen. The camera I used, the Samsung Gear 360, does not give you an equirectangular image. If you are a Samsung Gear 360 owner with the original codes that come with the hardware, you can easily convert your images and video to equirectangular using CyberLink ActionDirector, but I lost mine.

Sample Equirectangular image

To stitch in After Effects you can follow this tutorial. After you have an equirectangular image, you can upload the footage to YouTube as a 360 video once you update its metadata with a Python script or a 360 video metadata app. Alternatively, you can wrap the video around into a sphere and make tiny planets like the picture of me on my skateboard; to achieve this “tiny planet” effect, I followed this tutorial from Wren of Corridor Digital. 360 is a medium for storytelling that directly engages the viewer to look around; this makes it hard to tell a linear story and forces you to think about every scene more spatially. I’m curious to see what stories can come out of 360 video as a medium, but for now I just plan on making tiny planets.
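For the metadata step, one open-source option is Google’s Spatial Media tools, which include a Python injector for spherical metadata. The sketch below only builds and runs the command; the module name and `-i` (inject) flag are from memory of that tool and may differ between versions, so verify against its README before relying on it.

```python
import subprocess

def inject_360_metadata_cmd(src, dst, injector="spatialmedia"):
    """Build the command line for Google's open-source Spatial Media
    metadata injector. The -i flag asks it to write spherical (360)
    metadata into a copy of the video; exact flags may differ by
    version, so check the tool's --help output."""
    return ["python", "-m", injector, "-i", src, dst]

def inject_360_metadata(src, dst):
    """Run the injector; YouTube then recognizes dst as a 360 video."""
    subprocess.run(inject_360_metadata_cmd(src, dst), check=True)
```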

Tinker Lab Project – The Reading Minion

Here in the School of Media Studies Tinker Lab (AKA the Tinker Palace), we’re experimenting with Optical Character Recognition. Using open-source software, a webcam, a pcDuino, and a Minion doll, we’re building a system capable of recognizing printed text and reading it aloud.
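The recognize-then-speak pipeline might look something like the sketch below, which shells out to the open-source `tesseract` OCR engine and the `espeak` speech synthesizer. Both tools, and the wiring between them, are assumptions for illustration; the actual Minion build may use different software.

```python
import subprocess

def clean_ocr_text(raw):
    """Collapse OCR line breaks and stray whitespace into one line
    so the speech engine reads the text smoothly."""
    return " ".join(raw.split())

def read_page_aloud(image_path):
    """OCR an image with the tesseract CLI, then speak the result
    with espeak. Both are assumed to be installed on the pcDuino."""
    result = subprocess.run(
        ["tesseract", image_path, "stdout"],
        capture_output=True, text=True, check=True,
    )
    text = clean_ocr_text(result.stdout)
    subprocess.run(["espeak", text], check=True)
    return text
```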


Check out the Minion in action, reading Shakespeare!

Many thanks to Eva Jacobus and Dale MacDonald.


Summary of experience // Let’s SEE the Trash @ Ideas City

We learned a lot about our Let’s SEE the Trash project by entertaining many visitors at our booth at the Ideas City Festival this past weekend. In total, approximately 50 people visited us and took roughly 30 of our bookmark business cards with QR codes, designed by SMS graphic designer Chad Phillips.


From those visitors, our embedded video site received approximately 20 unique visits; we did not see any repeat visits to our booth.


We found it very useful to situate the project in public and discuss with visitors the impressions they formed of our work based on how we described it. Attempting to explain what the piece was, how it worked, and what its goal was in the greater context of the festival helped us fine-tune how we described the piece. Our description drifted away from “mobile augmented-reality app” towards “location-based documentary.” Several visitors inquired whether we were able to physically track garbage across a large distance, or whether we had obtained any data about the phenomenon we were attempting to depict. We were fortunate to be visited by a Department of Sanitation worker attending the festival, who provided useful insight into who to contact and how the department handles producing media pieces about its work. He stressed that the public should be made more aware of the process that garbage goes through after its initial disposal.


After our experience at Ideas City, the team feels that this was more a first iteration of the project than a finished product. That said, we’re proud of the technology we developed for this first iteration of the piece, which included: GPS detection and real-time updating in a web-based app using the Google Maps API, custom map styling, geo-fencing of points of interest, reactive point-of-interest icons, and custom video playback using the YouTube API. We plan to reach out to our new contact in the Department of Sanitation in an effort to involve them. Additionally, we think it would be beneficial to rework our description of the piece to that of a location-based documentary, which better describes the intention of the piece given its current technological approach. In describing the project as documentary, the footage currently employed would need to be reconceived and reshot, most likely with the help of Red Dog Productions. Finally, we found that if we include a URL in our promotional material, it should be shortened with goo.gl or Bitly.
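Of those features, the geo-fencing is easy to sketch: a visitor is “inside” a fence when the great-circle distance from their GPS position to a point of interest falls under a radius. The production piece was a web app, so this Python version is only an illustration of the idea; the coordinates and 30 m radius in the test are made up.

```python
import math

EARTH_RADIUS_M = 6371000  # mean Earth radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def inside_geofence(user, poi, radius_m=30):
    """True if the user's (lat, lon) is within radius_m of the POI,
    e.g. to trigger a point-of-interest video in the app."""
    return haversine_m(*user, *poi) <= radius_m
```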


-David Wilson
Research Assistant
Public Interactives Research Team

Summary of experience // Temperament of Space @ Dawn of Summer event

I just wanted to send along a quick summary of my experience with Curiosity of Temperament of Space during the Dawn of Summer event at The New School University Center on Friday 5/1, specifically highlighting areas where I feel the piece was really successful, challenges where it might improve, and other general observations.
I would estimate the age range of students who experienced the piece as 18 to 27, and the ratio of female to male visitors at roughly 3 to 1. I would estimate the average length of an experience with the piece at 1 to 3 minutes, with short outliers around 30 seconds, while the longest stay came from two students who interacted with it constantly for well over 30 minutes (more on these two later). In total, I believe the piece saw between 60 and 70 unique visitors over the course of the 12 hours.
Feedback that I received from visitors was overwhelmingly positive. I took questions about who on campus was responsible for the piece, and found myself describing the nature of the research and the Public Interactives Research Team. Many visitors had specific questions about the technology and software employed, and how each worked to “see” or “detect” them.
< Successes >
Many visitors were drawn in by the visual aesthetics of the piece and were pleasantly surprised by the interactive audio element. Many commented that the piece was “relaxing” and “meditative,” and that the audio and visual elements were “beautiful.” Some visitors wanted to know where in the world the natural imagery was shot. Some visitors, even though the interaction design meant that the piece responded “slowly” rather than directly mirroring their movements, wanted to DANCE in the piece; at least 10 visitors did so over the course of the evening. Watching these visitors, I got the sense that they perceived the piece as augmenting, rather than mirroring, their movement. One group of students, actors from the drama school, mentioned that the piece could function well as a teaching aid (act out what you see, and what you see acts with you).
< Challenges >
Many visitors had to be prompted to enter INTO the piece to interact with it. At one point, Dale taped arrows on the ground in an attempt to guide people into the space and then into the piece to begin the interaction (unfortunately, it didn’t help much). I found that if I greeted guests and told them to walk in a general direction and that “something cool” would happen, visitors took that as a general invitation to enter the piece. I also found that general instructions kept visitors in the space longer than directing them specifically and telling them what would happen.
< the Outliers >
Two Parsons fashion design students, undergrad juniors, one male and one female, spent a very long time with the piece and in the room in general. Each commented that the piece was relaxing and that they enjoyed the pace at which it interacted with them. Each told me that they were very stressed out by their final projects and that the piece helped them relax in a really engaging, but not ‘lame,’ way. They made fun of the dance party happening in another part of the building and said that Temp of Space provided them with a great social/technological alternative. They seemed to enjoy watching other visitors interact with the piece. The male student was very interested in the technology and commented that he wished the fashion department would incorporate reactive elements into its design curriculum.
< Suggestions for further development >
* A randomizing mechanism for the audio distance detection: once a user’s distance is detected, alter the upper and lower bounds of the ultrasonic sensor slightly (for instance +/- 5) to prevent users from finding a zone that does something specific. Essentially, contextually randomize the experience to promote further engagement and wandering within the space.
* We should consider strategies for shepherding users/visitors INTO the space, eliminating (or reducing) the need for a person to be there with the piece to guide visitors inside.
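The first suggestion could be sketched as follows. The bound values, the centimeter units, and the +/- 5 jitter are illustrative, following the numbers in the suggestion rather than the piece’s actual sensor configuration.

```python
import random

def jittered_bounds(lower, upper, jitter=5, rng=random):
    """Return (lower, upper) ultrasonic-sensor trigger bounds, each
    nudged by up to +/- jitter (e.g. cm) so repeat visitors can't
    find a fixed 'magic spot' that always does the same thing."""
    lo = lower + rng.uniform(-jitter, jitter)
    hi = upper + rng.uniform(-jitter, jitter)
    if lo > hi:  # keep the detection window valid after jittering
        lo, hi = hi, lo
    return lo, hi
```

Re-running this after each detection would shift the responsive zones slightly every time, nudging visitors to keep wandering.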
-David Wilson
Research Assistant
Public Interactives Research Team

Ideas City 2015 Project

PiRT is currently working on a project to be included in the Ideas City Festival 2015 at New Museum. The project is a locative experience that makes visible the invisible process of garbage collection.

We’ll post more about our progress soon.