Design and media art projects.

Wayback Time Machine

The Wayback Time Machine provides a way to visually explore the evolution of a website. It is inspired by Apple's 3D visual language in iOS and macOS.

It is powered by the Internet Archive's Wayback Machine. See the Wayback Machine CDX API.
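As a rough sketch of how the list of snapshots might be fetched from the CDX API (the query parameters and helper function below are illustrative assumptions, not the project's actual code; assumes Node.js 18+ with global fetch):

// List Wayback Machine snapshot URLs for a site via the CDX API.
// collapse=timestamp:6 roughly groups snapshots by month; this is just one possible query.
async function listSnapshots(site) {
  const cdx = 'https://web.archive.org/cdx/search/cdx' +
    `?url=${encodeURIComponent(site)}&output=json&fl=timestamp,original&collapse=timestamp:6`;
  const rows = await (await fetch(cdx)).json();   // first row is the field-name header
  return rows.slice(1).map(([timestamp, original]) =>
    `https://web.archive.org/web/${timestamp}/${original}`);
}

listSnapshots('nytimes.com').then(urls => console.log(urls.slice(0, 5)));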

Created with Jono Brandel at the 2017 Internet Archive Experiments Hackathon. See the source code on GitHub.

A microservice built with Node.js and headless Chromium generates the screenshots from the Wayback Machine. They are rendered along a 3D axis with Three.js.
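A minimal sketch of those two pieces, assuming the puppeteer library for driving headless Chromium and an already-loaded array of screenshot textures on the Three.js side (neither is the project's actual implementation):

// Screenshot step: capture a Wayback snapshot with headless Chromium via puppeteer.
const puppeteer = require('puppeteer');

async function screenshot(snapshotUrl, outPath) {
  const browser = await puppeteer.launch();                    // headless Chromium
  const page = await browser.newPage();
  await page.setViewport({ width: 1280, height: 800 });
  await page.goto(snapshotUrl, { waitUntil: 'networkidle2' }); // wait for the page to settle
  await page.screenshot({ path: outPath });
  await browser.close();
}

// Rendering step: one textured plane per snapshot, stacked along the z-axis.
// `textures` is assumed to be an array of THREE.Texture objects loaded from the screenshots.
function layoutSnapshots(scene, textures) {
  textures.forEach((texture, i) => {
    const material = new THREE.MeshBasicMaterial({ map: texture });
    const plane = new THREE.Mesh(new THREE.PlaneGeometry(16, 10), material);
    plane.position.z = -i * 5;   // older snapshots recede into the distance
    scene.add(plane);
  });
}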

Site: http://wayback-timemachine.pages.archivelab.org?q=nytimes.com
Technologies: HTML, React, Headless Chromium, WebGL / Three.js

Gifcities.org

"GifCities: The GeoCities Animated Gif Search Engine was a special project of the Internet Archive done as part of our 20th Anniversary to highlight and celebrate fun aspects of the amazing history of the web as represented in the web archive and the Wayback Machine."

GifCities began as a side project by my colleagues Jefferson and Vinay, who had extracted all of the animated GIFs from the GeoCities pages in the Wayback Machine. I saw an opportunity to build a user interface for this search index and developed one over a weekend. It later became an official project of the Internet Archive, and we spent more time refining it before releasing it to the public.

Site: https://gifcities.org
Technologies: HTML, React

Press:
- https://techcrunch.com/2016/10/27/gifcities-is-a-search-engine-for-vintage-gifs-from-the-90s/
- https://blog.archive.org/2016/11/01/gifcities-the-geocities-animated-gif-search-engine/
- https://www.producthunt.com/posts/gifcities
- https://boingboing.net/2016/10/27/gifcities-a-search-engine-for.html
- https://lifehacker.com/find-animated-gifs-from-the-early-web-with-gifcities-1788333461

Archive Experiments

Inspired by Google's Chrome Experiments, Archive Experiments is a showcase of community-made experiments built with data and services from Archive.org.

I conceived, designed, developed, and maintain this showcase. It started as an idea and side project and has since been endorsed by the Internet Archive. In September 2017, an Archive Experiments Hackathon was held at the Internet Archive.

Site: https://experiments.archivelab.org
Technologies: HTML

Announcing x-gui

In March 2016 I presented at the Sandstorm SF Meetup. I shared the design process that went into creating TextEditor and extrapolated a set of design guidelines for creating open source apps. As a follow-up, I am announcing x-gui, a library of web components for building consistent web apps.

x-gui is an experiment and is evolving rapidly as I prototype more apps to learn what components are needed. It could be compared to Google's Polymer Catalog, but the key differences are that it is built without a library like Polymer and has a completely different visual style (it doesn't look like Google).
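As a rough illustration of the framework-free approach, a component can be defined with the standard Custom Elements and Shadow DOM APIs. The <x-button> element below is hypothetical, not necessarily one that x-gui ships; see the README for the actual components.

// Hypothetical vanilla web component with no framework dependency.
class XButton extends HTMLElement {
  constructor() {
    super();
    const shadow = this.attachShadow({ mode: 'open' });
    shadow.innerHTML = `
      <style>
        button { padding: 0.5em 1em; border: 1px solid #444; background: #fff; cursor: pointer; }
      </style>
      <button><slot></slot></button>
    `;
  }
}
customElements.define('x-button', XButton);

// Usage in a page: <x-button>Save</x-button>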

You can read more about the project on the x-gui GitHub page; the README has a lot more info. There's also an online demo.

Want to start using it?

bower install x-gui/x-gui
# or
git clone git@github.com:x-gui/x-gui.git

Below are the slides from the talk.

Object Photography

(Above: some photographs from this series.)

In late 2011 I began photographing (mostly electronic) objects. Most of them are objects I have owned, used, and have some affinity for. A good photograph can capture an object's essence and convey the qualities its owner saw in it.

I end up selling some of these things on Craigslist or eBay, which I prefer to throwing them in the trash. Selling objects through these services provides a good sense of closure, and the photograph becomes an artifact, a means of holding onto the object without keeping the object itself.

Instead of filling a closet, these things can be let go, yet kept alive forever through a photograph in a photo album.

Materials: Retired electronics, Nikon D100 digital SLR, makeshift photo set

Non Projections

Using custom VJ software I developed, I performed visuals at a couple of the early Nonprojects record label shows. The software let me mix in a live video feed from a wireless camera mounted on the ceiling, pointed down at the performers.

However, after performing visuals twice, I concluded that I didn't enjoy it and would rather perform music.

I released the software as open source on GitHub. You can download it here: https://github.com/rchrd2/nonprojector

Tutorial Videos:
- https://www.youtube.com/watch?v=0FRYPPk-Ogk
- https://www.youtube.com/watch?v=Glr_HuwdWWU

Materials: Wireless camera, MIDI controller, projector, custom VJ software
June 2010

The B152's

DESMA 152B was a freeform class led by Professor Chandler McWilliams. The premise was to make something using alternative computer interfaces. I took a leadership role and guided the class toward creating musical instruments and forming a group to perform with our inventions. I developed a conducting language that could be used for improvised group performances: gestures signaled individual performers to navigate a predetermined set of musical possibilities. At the senior show, I took the role of conductor while Professor McWilliams performed on my Photokoto. Our conducting language was influenced by Walter Thompson's Soundpainting language, as well as by my experience performing with Synthia Payne.

Links:
- http://en.wikipedia.org/wiki/Soundpainting
- Class website: http://classes.dma.ucla.edu/Spring09/152BC/?cat=13

Additional Files:
- Guidelines.pdf
- Video (coming soon)

Final performance on June 4, 2009