Misadventures with Dragons and Deep Learning

Date
Feb 16, 2021

My daughter is writing a book about dragons, so I thought I would help her make a cover for it. Since I can’t draw, what is my natural recourse? Deep learning, of course! For those who are not familiar, deep learning is a branch of machine learning where you train artificial neural networks with many layers, hence “deep.” Lately there’s been a bit of buzz around generating fake celebrity images, and deep learning produces some pretty convincing ones. You give the model on the order of hundreds of thousands of images, and it learns how to generate non-existent people.

My idea was to use this same concept to generate dragons. I downloaded about 9,000 images of dragons. The code I used to download them is on GitHub (https://github.com/elgood/CoverGenerator), though it’s probably not ready for prime time yet. Now, 9,000 is a lot less than the CelebA dataset’s 200K images, which may explain some of my lack of success in generating dragons. The CelebA data is also a lot cleaner; its images have been cropped to just the face. I deleted a few images that were obviously not dragons, but there is still a lot of data cleaning work to be done.
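
The first cleaning pass I have in mind is purely mechanical: drop anything that won’t decode as an image or is too small to be worth resizing. Here’s a minimal sketch using Pillow; the folder name and size threshold are placeholders, not what the repo actually does:

```python
from pathlib import Path
from PIL import Image

def clean_images(folder="dragons", min_side=64):
    """Delete files that won't decode as images or are smaller than min_side."""
    for path in Path(folder).iterdir():
        if not path.is_file():
            continue
        try:
            with Image.open(path) as img:
                img.verify()  # raises on corrupt or truncated files
            with Image.open(path) as img:  # verify() invalidates the handle
                if min(img.size) < min_side:
                    path.unlink()
        except Exception:
            path.unlink()  # not a readable image; drop it
```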

So, I took my 9K images and trained a relatively simple network; a sketch of the general architecture follows the AWS details below. The input images were down-sampled to 64×64 and the output images were also 64×64, so not big enough to be useful at this point. I used Amazon Web Services, selecting the following:

  • Instance AMI: Deep Learning AMI (Ubuntu 18.04) Version 40.0 – ami-072519eedc1730252
  • Instance type: p2.xlarge – This has 4 vCPUs and 1 GPU.
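
The training code itself is in the repo, but for context, the standard approach for generating 64×64 images is a DCGAN, as in the PyTorch DCGAN tutorial. A generator in that style looks roughly like this; the hyperparameters below are the tutorial’s defaults, not necessarily exactly what my network used:

```python
import torch
import torch.nn as nn

nz = 100   # latent vector size (tutorial default)
ngf = 64   # generator feature-map width
nc = 3     # RGB output channels

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.main = nn.Sequential(
            # latent z (nz x 1 x 1) -> (ngf*8) x 4 x 4
            nn.ConvTranspose2d(nz, ngf * 8, 4, 1, 0, bias=False),
            nn.BatchNorm2d(ngf * 8),
            nn.ReLU(True),
            # -> (ngf*4) x 8 x 8
            nn.ConvTranspose2d(ngf * 8, ngf * 4, 4, 2, 1, bias=False),
            nn.BatchNorm2d(ngf * 4),
            nn.ReLU(True),
            # -> (ngf*2) x 16 x 16
            nn.ConvTranspose2d(ngf * 4, ngf * 2, 4, 2, 1, bias=False),
            nn.BatchNorm2d(ngf * 2),
            nn.ReLU(True),
            # -> ngf x 32 x 32
            nn.ConvTranspose2d(ngf * 2, ngf, 4, 2, 1, bias=False),
            nn.BatchNorm2d(ngf),
            nn.ReLU(True),
            # -> nc x 64 x 64, values in [-1, 1]
            nn.ConvTranspose2d(ngf, nc, 4, 2, 1, bias=False),
            nn.Tanh(),
        )

    def forward(self, z):
        return self.main(z)

# A batch of 16 latent vectors produces 16 fake 64x64 RGB images.
fake = Generator()(torch.randn(16, nz, 1, 1))  # shape: (16, 3, 64, 64)
```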

After about three days and 600 epochs, the images at the top are some of the best it produced. So, yeah, not the best. The two on the bottom right kind of look like cards, probably because some of the training images were of dragons on trading cards. Some kind of look like dragons with two wings, but you have to squint. I probably needed to train longer, but after three days I had already spent $30 on the Amazon instance. There are things I can do to improve my time spent on AWS:

  • I wasn’t using any workers for the data loader. In a little experiment before I terminated the AWS instance, I found that using 4 workers cut my time per epoch by a factor of about 2.25 (graph below). Next time I’ll use more workers so I can better utilize the GPU; a sketch of the change is shown after this list.
  • I was using an on-demand instance, whereas a spot instance is about 70% cheaper. To use a spot instance, I’ll need a way of recovering the model should my instance be terminated prematurely; a checkpointing sketch is also shown below. Once I have that mechanism, I can safely use a spot instance and save money that way.
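
On the workers point: in PyTorch this is just the num_workers argument to DataLoader. A minimal sketch, where the dragons path and batch size are placeholders:

```python
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# ImageFolder expects class subdirectories, e.g. dragons/all/*.jpg.
dataset = datasets.ImageFolder(
    "dragons",
    transform=transforms.Compose([
        transforms.Resize(64),
        transforms.CenterCrop(64),
        transforms.ToTensor(),
        transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5)),
    ]),
)

# num_workers=4 decodes images in four subprocesses instead of the main
# process (the num_workers=0 default), so the GPU spends less time waiting.
loader = DataLoader(dataset, batch_size=128, shuffle=True,
                    num_workers=4, pin_memory=True)
```

On the spot-instance point, the recovery mechanism is just periodic checkpointing: save the generator, discriminator, and optimizer states every few epochs and reload them on restart. A sketch of what I have in mind; the function names are mine, not the repo’s:

```python
import torch

def save_checkpoint(path, epoch, netG, netD, optG, optD):
    """Save everything needed to resume GAN training mid-run."""
    torch.save({
        "epoch": epoch,
        "netG": netG.state_dict(),
        "netD": netD.state_dict(),
        "optG": optG.state_dict(),
        "optD": optD.state_dict(),
    }, path)
    # For spot instances, also copy `path` to durable storage (e.g. S3),
    # since the instance's disk disappears when it is reclaimed.

def load_checkpoint(path, netG, netD, optG, optD):
    """Restore state and return the epoch to resume from."""
    ckpt = torch.load(path, map_location="cpu")
    netG.load_state_dict(ckpt["netG"])
    netD.load_state_dict(ckpt["netD"])
    optG.load_state_dict(ckpt["optG"])
    optD.load_state_dict(ckpt["optD"])
    return ckpt["epoch"] + 1
```

With something like that in place, a reclaimed spot instance costs at most a few epochs of progress.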

Looks like it won’t be as easy as I thought to create some realistic dragons, but there are still lots of things to try. It should be fun!

Graph: epoch time improves as the number of data loader workers increases.