Snapchat Lens Creator Spotlight: Pam Taylor

Pam Taylor is just 16, but she's already making waves in the AR world. A natural creative, she’s made lenses based on breakfast cereal, makeup trends, and dream vacations.

I was able to catch up with the well-spoken, talented creator who’s already an official Lens Studio Creator and a soon-to-be AR megastar.

Q: How long have you been using Snapchat? When did you start making your own lenses?

A: I’ve been using Snapchat since 2017. I started my lens creator journey on March 1st, 2020.

Q: Which lenses (of ones you’ve made or others) are your current favorites?

A: As of now, “Pumpkin Glitch,” “Pam’s Glasses,” and “2 Eyebrow Slits” are some of my favorites.

Q: What do you like about creating lenses and seeing people use them?

A: Making lenses is my outlet for expression, and I’m fully in control of the final result. If I want something to look a certain way, it’s completely up to me, my vision, and the skills I use to get there. Having an idea, plotting out the steps, and seeing it through to completion is very fulfilling. What makes it even more worthwhile is seeing people interact with my lenses, use them consistently, and share them with friends. It’s an awesome feeling to see these creations being well received. Sometimes things don’t go as planned when making lenses; you can incorporate those happy accidents into a new lens, or the lens evolves as you make it.

Q: In what ways do your lenses currently use machine learning?

A: Most of my lenses are Style Transfer ML-based lenses, like “Blue Phoenix.” I also incorporate the use of Lens Studio’s Foot Tracking model in my lens “Chattery Feet.” My lenses “Calavera” and “Pumpkin Glitch” both use Style Transfer ML and Blend Shapes using Lens Studio’s Facial Expressions guide. I have some eyebrow lenses that use a form of Style Transfer ML as well.

Q: And did you build those ML models yourself?

A: I haven’t written my own notebook to train my models, but I’ve trained all of the models I’ve used for Style Transfer lenses. For most of the lenses, I’ve used Lens Studio’s Style Transfer Google Colab Notebook, and for the others I’ve used some notebooks I’ve found online. I’ve tweaked the code a bit in the notebooks to vary the output I get. It’s fun seeing how a couple of lines of code can change the outcome, or even something as simple as playing around with the weights, loss, and training steps.
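The knobs Pam mentions, style and content weights, the loss, and the number of training steps, are the core of any style-transfer training loop. The sketch below is a minimal, hypothetical illustration of that loop (it is not Lens Studio's actual Colab notebook, and it uses a tiny random conv net as a stand-in for the pretrained feature extractor, such as VGG, that real notebooks rely on):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical stand-in feature extractor; real style-transfer notebooks
# use a pretrained network such as VGG.
features = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
    nn.Conv2d(8, 8, 3, padding=1), nn.ReLU(),
)

def gram(x):
    # Gram matrix: channel-wise feature correlations, which capture "style".
    b, c, h, w = x.shape
    f = x.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

content_img = torch.rand(1, 3, 32, 32)
style_img = torch.rand(1, 3, 32, 32)
# Optimize the output image itself, starting from the content image.
output = content_img.clone().requires_grad_(True)

# The knobs from the interview: weights, loss balance, and training steps.
content_weight, style_weight, steps = 1.0, 1e3, 50
opt = torch.optim.Adam([output], lr=0.05)

with torch.no_grad():
    content_target = features(content_img)
    style_target = gram(features(style_img))

for _ in range(steps):
    opt.zero_grad()
    feats = features(output)
    loss = (content_weight * nn.functional.mse_loss(feats, content_target)
            + style_weight * nn.functional.mse_loss(gram(feats), style_target))
    loss.backward()
    opt.step()
```

Raising `style_weight` pushes the result toward the style image's texture, while more `steps` lets the optimization converge further, which is the kind of tweaking Pam describes doing in the notebooks.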

Q: As a Lens Studio Creator, what opportunities do you think a tool like SnapML provides?

A: Before Snap’s partner summit last year, I’d never seen ML before. Seeing the capabilities it gave lenses that day opened my eyes to a whole other world of possible lenses I could create. It made me so excited to start creating that I decided to invest in the stock market to buy a new computer and increase my lens-creating ability. Machine learning provided a whole new avenue in creation; it added a new superpower to Lens Studio by being able to handle these types of file imports. It shook the community a bit too, because it told creators that Lens Studio is the place to be, because it has SnapML.

Q: Do you think Snapchat will only increase in popularity in the future or die off like other social media platforms have?

A: Over time, I think Snapchat will age well [and so will] SnapML — it really established itself as a force to be reckoned with in the AR industry. Snap has continually increased in daily active users and it’s a fun app to use, so I don’t see it going anywhere. Snap is a one-stop-shop for content: talking to your friends, playing games, and a wide variety of AR experiences.

Q: Where do you see your career going — how important is it for you to be a digital creator?

A: Being 16, I still have a lot of time to decide what I want to do, and it gives me time to fully develop my AR skills before making a career choice. I’m glad I have such an early start because I do believe I want to end up pursuing AR to some extent.



Our team has been at the forefront of Artificial Intelligence and Machine Learning research for more than 15 years, and we're using our collective intelligence to help others learn, understand, and grow using these new technologies in ethical and sustainable ways.
