Art, probably

sunwukung
Jun 5, 2022


Screenshots of the output from the project.

In the winter of 2021, an old colleague of mine, Toby Eglesfield (https://twitter.com/notyetmeasured), got in touch and suggested a collaboration on an NFT project. A bit of background: I used to be an artist for a video-game company, and Toby was the lead artist on my team. Many years later, I switched career to become a software engineer, while Toby emigrated to New Zealand where he now works as a freelance designer/teacher. We’d kept in touch over the years, but serendipity and the burgeoning NFT market brought us together. Toby would be providing the artistic inspiration, while I would take care of coding duties.

FX Hash

Toby was very active in various Tezos NFT communities, like hicetnunc (now OBJKT) and teia, and had been exploring a new Tezos-based platform, FXHash. The FXHash community is primarily focused on procedural/generative art, but Toby had some interesting ideas about creating a PFP (profile picture) project that leveraged some of the platform’s capabilities.

We decided to name the project “Fizzogs”, a contraction of the word physiognomy. Toby had created a collection of assets with a really distinctive style and palette. Here’s an example of one of them below:

A happy fizzog!

Project Setup

The first thing I needed to do was create a development environment. Luckily, the team at FXHash have a handy webpack boilerplate project for exactly this purpose.

Toby provided a collection of SVG assets that could be used to build the images out of layers. There were 18 assets each for the torso, head, hair, nose, eyebrows and mouth. In addition, there were 11 unique facial expressions which could be flipped horizontally, as well as 32 colour palettes.

The next task was to select from these assets and compose them into a face. It wasn’t sufficient to just randomly select images, as this would create non-deterministic output, which would break the contract required by FXHash. The boilerplate project above provides a utility, fxrand, to obtain random values. It returns a floating-point number between 0 and 1. In a local environment this value is always random, but once the project is deployed the values it returns become deterministic for a given token, guaranteeing the same output every time that token is rendered. The trick was using this floating-point value to select from the options above. Initially I wrote my own method that would quantise this value into a range for selection, but once again the FXHash community had already solved this problem and provided a library of methods for selecting options or generating values.

You can see an example of how I used this to select the SVG files below.
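Here’s a minimal sketch of the idea; the pickVariant helper and the file names are stand-ins built directly on fxrand, whereas the real code leaned on the library’s own selection methods:

// fxrand is a global provided by the FXHash boilerplate: a float in
// [0, 1) that becomes deterministic per token once minted.
// The file names here are placeholders, not the real asset names.
const heads = ['head-01.svg', 'head-02.svg', 'head-03.svg']; // ...18 in the real project

// Quantise the 0–1 float into an index over the available options.
const getDynamicVariant = (options) => options[Math.floor(fxrand() * options.length)];

// During development a specific variant can be pinned via debug options.
const getLockedVariant = (options, index) => options[index];

const pickVariant = (options, debug = {}) =>
  Number.isInteger(debug.locked)
    ? getLockedVariant(options, debug.locked)
    : getDynamicVariant(options);

const head = pickVariant(heads, { locked: 2 }); // locked while debugging
// const head = pickVariant(heads);             // dynamic in production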

In the snippet above you can see that I make a distinction between getting a locked or dynamic variant. This was to enable me to lock the selected file during development by passing in various debugging options.

Rendering

Once I was able to select the assets and palette — I needed to find a way of rendering them. I looked at a range of SVG libraries, including paper.js, Snap.svg and SVG.js. In the end, I chose to use two.js as it was lightweight and had an active community. Two provides a simple method of loading SVG assets into the scene — here’s an example of how I used this with the options I’d selected in the earlier snippet:
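A rough sketch of the loading step (the directory layout is made up, and I’ve used fetch plus two.interpret here, which is one of a couple of ways two.js can ingest an SVG file):

import Two from 'two.js';

// An 800 x 800 stage using the SVG renderer.
const two = new Two({ type: Two.Types.svg, width: 800, height: 800 })
  .appendTo(document.body);

// Fetch an SVG file and hand it to two.interpret, which converts the
// markup into Two.Groups/Two.Paths and adds them to the scene.
const loadAsset = async (path) => {
  const markup = await fetch(path).then((res) => res.text());
  const node = new DOMParser()
    .parseFromString(markup, 'image/svg+xml')
    .documentElement;
  return two.interpret(node);
};

// Layer the variants selected earlier, back to front.
const composeFace = async ({ torso, head, hair }) => {
  const torsoGroup = await loadAsset(`assets/torsos/${torso}`);
  const headGroup = await loadAsset(`assets/heads/${head}`);
  const hairGroup = await loadAsset(`assets/hair/${hair}`);
  two.update();
  return { torsoGroup, headGroup, hairGroup };
};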

Positioning and Vectors

With all the face elements correctly placed, I needed to apply offsets to the elements to position them according to the loaded expression. One of the interesting things I discovered about working with SVGs at this point is that the co-ordinate space of a parent doesn’t propagate down to its children. For example, take the SVG below.
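Something along these lines (an illustrative file rather than one of the actual assets):

<svg viewBox="0 0 2048 2048">
  <g id="head">
    <!-- child geometry is still expressed in the 2048 x 2048 space -->
    <circle cx="1024" cy="1024" r="600" />
  </g>
</svg>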

I scaled the parent element by a percentage to make it fit an 800 x 800 box. However, when I later needed to reach inside this to manipulate one of the children — they behaved as if they were still inside a 2048 x 2048 space.

With this in mind, I utilised two.js’ vector positioning capability to add the offset to its current position, which was simpler than calculating the absolute co-ordinates — and would allow us to blend these expression positions if we needed to.
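In code that boils down to something like this (the offsets are made-up values in the asset’s own co-ordinate space):

// Per-element offsets for a given expression.
const expression = {
  eyebrows: { x: 0, y: -40 },
  mouth: { x: 12, y: 25 },
};

// Add the offset to the element's current translation vector rather
// than computing absolute positions from scratch.
const applyOffset = (group, { x, y }) => {
  group.translation.x += x;
  group.translation.y += y;
};

// eyebrowsGroup and mouthGroup come from the loading step above.
applyOffset(eyebrowsGroup, expression.eyebrows);
applyOffset(mouthGroup, expression.mouth);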

Animation

With all the assets loaded and positioned correctly, it was time to apply the animations to the elements. This was by far the most laborious step in creating this project. Toby had designed a corresponding animation for each element, and each one had to be hand-coded. I chose tween.js as the animation library, since there were working examples on the two.js documentation site.

This presented me with the next challenge when working with SVGs. Many of the animations required the origin point of the SVG to be moved off-centre. Some of the SVG libraries mentioned earlier provide a means to do this, but two.js, being as lightweight as it is, requires you to do it manually. The technique involved repositioning the vertices of the elements within the SVG, and then moving the entire SVG back to its original location. I wrote a utility to do this that allowed me to specify a compass point to lock the origin point to:
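A rough sketch of such a utility, shifting child translations rather than raw vertices and glossing over the parent-scaling wrinkle described earlier, looks like this:

// Move a group's origin to a compass point ("S" = bottom, "NE" = top-right, etc.)
// without visibly moving the group on screen.
const setOrigin = (el, compass) => {
  // Measure before centering; calling center() first skews the values
  // that getBoundingClientRect() reports.
  const rect = el.getBoundingClientRect();
  el.center();

  const offset = { x: 0, y: 0 };
  if (compass.includes('N')) offset.y = rect.height / 2;
  if (compass.includes('S')) offset.y = -rect.height / 2;
  if (compass.includes('W')) offset.x = rect.width / 2;
  if (compass.includes('E')) offset.x = -rect.width / 2;

  // Reposition the contents relative to the new origin...
  el.children.forEach((child) => {
    child.translation.x += offset.x;
    child.translation.y += offset.y;
  });

  // ...and compensate so the group stays where it was on screen.
  el.translation.x -= offset.x;
  el.translation.y -= offset.y;

  return el;
};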

Note that I had to get the bounding client rectangle of the SVG before calling el.center(). This was to prevent the values returned by this method from being “corrupted”.

Finally, I had to define keyframes in tween.js, which involved creating new tweens and chaining them together. I expressed the keyframe durations as percentages of a global frame budget, which meant we could tweak the animations to run shorter or longer if we needed to. You can see an example of an animation for one of the heads below, which makes a call to the repositioning utility above.
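As a sketch, with made-up rotation values, a hypothetical TOTAL_MS frame budget, and the setOrigin utility and two instance from the earlier snippets:

import * as TWEEN from '@tweenjs/tween.js';

// Hypothetical global frame budget for one loop of the animation.
const TOTAL_MS = 4000;

const animateHead = (head) => {
  // Pivot from the bottom of the head.
  setOrigin(head, 'S');

  // Each keyframe is a tween whose duration is a percentage of the
  // frame budget; the tweens are then chained together.
  const tiltLeft = new TWEEN.Tween(head)
    .to({ rotation: -0.1 }, TOTAL_MS * 0.25)
    .easing(TWEEN.Easing.Quadratic.InOut);
  const tiltRight = new TWEEN.Tween(head)
    .to({ rotation: 0.1 }, TOTAL_MS * 0.5)
    .easing(TWEEN.Easing.Quadratic.InOut);
  const settle = new TWEEN.Tween(head)
    .to({ rotation: 0 }, TOTAL_MS * 0.25)
    .easing(TWEEN.Easing.Quadratic.InOut);

  tiltLeft.chain(tiltRight);
  tiltRight.chain(settle);
  tiltLeft.start();
};

// tween.js needs ticking on every frame alongside two.js' own loop.
two.bind('update', () => TWEEN.update());
two.play();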

By and large animation was pretty straightforward, but some aspects of it needed me to brush up on my trig to determine the path or rotation point of an element in the scene.

Features

The final touch was to define a “features” object for the token. This is a way of relaying metadata about the project to FXHash, so it can tell how often a particular feature of the project has been minted, and thus its rarity. For example, in our project, each of the core elements had 18 variations — so I needed to report which one had been loaded in a generated token.

We could have simply gone with declaring which numbered variant had been loaded, but we both thought this was a little dry. So we opted instead to give each individual element an alias, which added another opportunity for some creativity. We also declared the facial expressions with unique “moods”, like “Chipper”, or “Goading”.
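At the time, FXHash read these features from a global object declared by the token; with invented aliases standing in for the real ones, it looked roughly like this:

// Features are declared on a global that FXHash reads at mint time.
// The aliases below are invented examples, not the real ones.
window.$fxhashFeatures = {
  Torso: 'Barnaby',   // alias for one of the 18 torso variants
  Head: 'Clementine', // alias for one of the 18 head variants
  Hair: 'Quiff',      // alias for one of the 18 hair variants
  Mood: 'Chipper',    // one of the facial expressions
};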

Finally, I created a pseudo DNA string for the particular combination of elements used in the face, the “fizzog” if you will. Each element’s variant number (1–18) selects a character from a string of alternating consonants and vowels, and the characters are joined together. So for torso #7, face #9 and hair #12 you would get a DNA string of “J-O-P”.
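As a sketch, with a stand-in alphabet rather than the string we actually used:

// Stand-in alphabet of alternating consonants and vowels; the real
// project used its own 18-character string.
const DNA_ALPHABET = 'BACEDIFOGUHAJEKILO';

// Each element's 1-based variant number picks a character, and the
// characters are joined into the "fizzog" DNA.
const buildDna = (...variantNumbers) =>
  variantNumbers.map((n) => DNA_ALPHABET[n - 1]).join('-');

buildDna(7, 9, 12); // "F-G-A" with this stand-in alphabet ("J-O-P" in the real project)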

A Note on Testing

Yeah, about that. As a software engineer, releasing something into production without tests feels like jumping out of a plane without a parachute. So, with a heavy heart, I must confess that I didn’t write any. Don’t judge me!

However, barring a couple of utility functions, most of the work was handled by libraries, so testing would have largely involved a lot of mocking for not much benefit. The real crux of the problem was the probabilistic nature of the medium — how do you test something that has randomness built into it?

To that end, I did whip together a small script to soak test the app, refreshing it thousands of times and capturing screenshots so that we could eyeball the output and see if any were broken.
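It was something along the lines of this Puppeteer sketch (the URL, iteration count and output paths are placeholders):

const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  // Reload the locally served project over and over, saving a
  // screenshot of each render so broken outputs can be spotted by eye.
  // Assumes a ./screenshots directory already exists.
  for (let i = 0; i < 1000; i++) {
    await page.goto('http://localhost:8080', { waitUntil: 'networkidle0' });
    await page.screenshot({ path: `screenshots/fizzog-${i}.png` });
  }

  await browser.close();
})();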

You can see a screenshot of the output below.

so many fizzogs!

Screen Capture

One final challenge before we finished the project was to provide an easy way for users to capture images from the mint. This is a relatively easy task when using canvas, as you can just dump the image data straight out of the canvas, but it is not so straightforward with SVG. Two.js has a canvas rendering mode that I tried to use, but given we had made use of mix-blend-mode in CSS, this was lost in the rendering. Then I tried to apply the blend modes directly to the canvas at draw time, rather than via CSS. This nearly worked, but each successive shape in a layer blended with itself, creating an overlapping effect rather than the solid colours you can see in the existing work.

Luckily, NPM provides. I found yet another library, wired it up to a keypress, and it did the trick: https://www.npmjs.com/package/dom-to-image
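The wiring amounted to something like this (the key and the container selector are placeholders):

import domtoimage from 'dom-to-image';

// Pressing "s" captures the stage element (including CSS
// mix-blend-mode effects) and downloads it as a PNG.
document.addEventListener('keydown', async (event) => {
  if (event.key !== 's') return;

  const stage = document.querySelector('#stage'); // the SVG's container
  const dataUrl = await domtoimage.toPng(stage);

  const link = document.createElement('a');
  link.href = dataUrl;
  link.download = 'fizzog.png';
  link.click();
});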

Reflection

At the time we started this project, the NFT market was undergoing something of a gold rush, which has since collapsed along with the rest of the crypto market (for the time being…). However, neither of us was really motivated financially in creating this project, or we would probably have deployed it on a more visible or lucrative platform like OpenSea. We made a conscious choice not to use popular proof-of-work platforms due to their energy consumption (given my day job at a sustainable energy company, that wouldn’t wash!).

The tezos NFT community has an underground vibe, a potent mix of people from disparate walks of life. It’s punk, outsider, raw. The fact that the art they make is on the blockchain has become secondary to the movement that has emerged.

The real motivation was to create something using this new technology and take part in the growing community of coders and artists working in this area. The domain of generative art requires a unique combination of programming skill and artistic expression — and it’s a fascinating thing to see contributors develop in their use and understanding of the medium, sharing algorithms one day, and artistic inspiration the next.

Personally, it’s been a great experience which has required me to delve deeper into subjects like trigonometry, statistics, and probability. There are something like 28 billion unique combinations possible within the project, but only a 1 in 18 chance that you’ll see any given variant of an element. We considered adding some more easter eggs if you rolled a unique combination, but decided we’d save some of those thoughts for another day. The act of collaboration itself posed challenges, given that we are on opposite sides of the planet, resulting in fevered midnight chats or yawning dawn Zoom calls to iron out bugs and discuss features.

It was also an opportunity to gain a better understanding of the NFT/blockchain ecosystem and to explore the various movements and artists that are busy working off the beaten track, creating some fascinating digital artefacts.

But most importantly, I got to create a cool piece of art with an old friend.

I hope this article has helped anyone who is interested in getting involved in this scene. Thanks for reading!

You can see the Fizzogs project here: https://www.fxhash.xyz/generative/14239

You can find our Twitter profiles here: strictequals & measureless
