
Behind the scenes of “Word Crimes”

Talented artist uses Adobe After Effects to create fitting typographic animation for parody video

On Tuesday, July 15th, the most shared video on YouTube and Facebook was “Weird Al” Yankovic’s “Word Crimes,” a parody of Robin Thicke’s popular “Blurred Lines” single. With more than 12 million YouTube views and climbing, the song is both clever and catchy. But what really brings it to life is the video’s impressive typographic animation. Jarrett Heather, a software developer with the California Department of Food and Agriculture, spent 500 hours over three months working with Al Yankovic on the project, which relies heavily on Adobe After Effects, Photoshop, and Illustrator.



Adobe: How did you get involved with this project?
Heather: One day last November I opened my email, and there was a message from Al Yankovic. He had seen the “Shop Vac” video I created a few years ago and wanted to know if I’d be interested in a directing gig. I wrote back right away and let him know that I’d be honored to work on the project. He was still in the conceptual phase with the song, so I didn’t get the specs and lyrics until the first week in January.



Adobe: How did you get started?
Heather: In early January Al cut a demo together, which I used to create the animatic. I thought I would just get a click track with vocals tapped in, but it was a fully produced demo with six or seven vocal tracks. After Effects was my canvas for designing the entire animatic. I pulled in clip art from the web, put the assets together in Photoshop, and just brought them in and started working.



Adobe: What type of direction did “Weird Al” Yankovic give you for the visuals?
Heather: At the start of the project he had a long list of specific visual ideas he wanted to see me try out. One thing I was grateful for is that he decided the whole rap section should be done on a chalkboard; I love how it turned out. But some ideas didn't jibe with my vision for the video, and he was really respectful of that. Overall, he was wonderful to work with.



Adobe: How did you come up with the other ideas for the visuals?
Heather: I started by reading the lyrics and imagining the look of the video. I knew the concept would be a contrast between old media (textbooks and encyclopedias) and new media (interfaces, applications, and sites). Basically, I created a visual argument between the past and the present, the grammar police and the grammar criminals. I also looked at the "Blurred Lines" video a lot. That's the video we were parodying, and I wanted to bring in as much from the original video as I could. You can see that in the color palette and hashtag typography.




Adobe: Can you tell us more about how you utilized After Effects?
Heather: I had to do a little pre-analysis on the song before I got started, because three minutes and 44 seconds is too much to fit into one project file. I decided what the different sections would be and broke it into chunks. I used the type tool in After Effects to put the type on the canvas and went from there. The design process went pretty quickly. Every day I would do a test render of an animatic and send it to Al and he would give me feedback on the jokes and designs right away.

Adobe: Did you take any unique approaches with the animation?
Heather: I’m not a professional animator, so nothing technically crazy is going on in the video. Other animators may actually want to look at how often things aren’t moving in the animation. That alone may set the style apart from other typography videos. You can’t really read type while it is moving and you can’t read it while the camera is moving. I had to be careful about how much animation I put into it.

Adobe: What other Adobe products did you use on the “Word Crimes” project?
Heather: Most of the art assets were created in Photoshop or directly in After Effects, and I used Illustrator now and then. The "less or fewer" signage in the video is an illustration I created in Illustrator, and I also traced over the computer interface in the tweet animation in Illustrator to make it nice and sharp. I then used Adobe Premiere Pro to edit the whole thing together and Adobe Media Encoder to encode the final video.



Adobe: Tell us about your job.
Heather: I got started in design putting websites together. My job is mainly designing user interfaces for software, integrating them into the back end, and creating iOS and desktop apps. I mostly create graphics for the web, so Photoshop is my bread and butter, and I get to work with Adobe Creative Cloud on a daily basis. I'm lucky because my job gives me lots of opportunities to be creative.

Adobe: What’s next for you?
Heather: All my life I've been curious and I've learned to do different things. It turns out I'm good at a lot of them, but I don't have any formal training in the arts. I enjoy working on side projects, but I'm pretty picky about what I take on. I'm really happy to see how successful the video has been for Al and I want to see his Mandatory Fun album succeed, but I'm not looking to quit my day job! The Internet is a pretty fast-moving stream these days, so it won't be long until everyone is on to the next thing.

Learn more about Adobe Creative Cloud



Posted by: Adam Spiel on Oct 9, 2014 at 3:32:36 pm After Effects, Customers

Motion graphics and visual effects work shines at Sundance Film Festival

In the lead-up to the Sundance Film Festival, we had the good fortune to talk with a number of creative professionals involved in creating some of the great films premiering in Park City. From animated and live action shorts to feature-length documentaries and dramatic premieres, these films display a dizzying range of creativity and talent, as well as inspiring uses of motion graphics and visual effects. Me + Her, Hits, and The End of Eating Everything are three films in which Adobe After Effects played an important role.

Me + Her
Joseph Oxford started making Me + Her in 2009 when he was working as a production assistant. After creating characters from some pieces of cardboard, he started writing a script to tell their story. He developed the script and created the characters and set pieces by hand in his spare time. In 2013, he was finally able to shoot the live action short featuring animatronic rod puppetry.



The puppets that appear in the film were designed in Adobe Illustrator and mass produced so duplicates were available if needed. Oxford used Adobe Photoshop Lightroom to develop early color concepts to help determine the visual style before capturing the majority of content in camera.

All visual effects shots went through After Effects at some point, and the time-lapse sequence of the tree growing at the end of the film was the most labor intensive. While some elements were created in Maya, compositing the live action and 3D content made it all feel like live action. Oxford and his team used the Roto Brush in After Effects to fill in missing sections of the sky and the 3D Camera Tracker to turn on 200 light bulbs in one scene. Overall, the six-page script took 18 days to shoot, resulting in a whimsical short film about love and loss.

For more information, read the recently published Studio Daily article.

See more at http://www.cardboardfilm.com

Hits
Viewers who attend the screening of Hits aren't expecting to be wowed by visual effects. In fact, most would never guess that all of the YouTube screens that feature prominently in the film were built using After Effects. The production studio Final Cut worked on both editing and visual effects for the film by Screenwriter and Director David Cross.



Phil Brooks, a graphics and visual effects artist with Final Cut, was given free rein to recreate the YouTube site so he would have more control over the animation and camera moves. First, he used Illustrator to rebuild the user interfaces and buttons as vector graphics so they could scale as needed for shots. The screens were then taken into After Effects, where Brooks created most of the layouts and pages.

The magic of Photoshop enabled him to crop, prepare images, and remove people from backgrounds as needed. By creating invisible visual effects, Brooks effectively helped tell the film’s story without stealing the show.

The End of Eating Everything
The End of Eating Everything by Wangechi Mutu is a visually stunning short film that follows a creature through a vast atmosphere. Digital Artist Joaquin Jutt joined Mutu's team midway through the project as an editor, applying his background in 3D modeling and animation to add more dimension and scale to the fine art film.

Using Adobe Photoshop, Jutt took existing screenshots from the project and composited textures and colors to adjust the overall feeling. Various elements in the atmosphere, such as smoke and birds, add depth and interest. Jutt rendered one bird, animated it flying and diving, created a loop, and then used Particular in After Effects to create the swarm of birds. Animation techniques were also used to make the tentacle on the creature’s head and the internal organs move, pulsate, and change as she moves and spins.
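The loop-plus-instancing approach Jutt describes generalizes nicely: render one flight cycle, then stamp it out many times with randomized start frames, positions, and sizes so no two birds flap in sync. Below is a minimal Python sketch of that instancing idea; it is not Trapcode Particular itself, and every name and parameter here is an illustrative assumption.

```python
import random

# A toy sketch of loop-based instancing: one pre-rendered bird flight
# cycle is reused for every particle, with a random start frame,
# position, and scale per bird so the swarm never moves in lockstep.

LOOP_FRAMES = 48  # length of the pre-rendered flight cycle (assumed)

def make_swarm(count, seed=7):
    rng = random.Random(seed)
    return [{
        "offset": rng.randrange(LOOP_FRAMES),  # desynchronizes wing beats
        "pos": (rng.uniform(-1, 1), rng.uniform(-1, 1), rng.uniform(0, 1)),
        "scale": rng.uniform(0.4, 1.0),        # smaller reads as farther away
    } for _ in range(count)]

def sprite_frame(bird, comp_frame):
    """Which frame of the loop a given bird shows at a given comp frame."""
    return (comp_frame + bird["offset"]) % LOOP_FRAMES

swarm = make_swarm(200)
print(sprite_frame(swarm[0], comp_frame=100))
```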



Editing the eight-minute short took eight months. By tackling each challenge individually, from matching the model’s skin tone to mapping the spinning actress to the animated model, Jutt and the small team of editors and animators helped create a film that was true to Mutu’s vision and point of view.

Watch the visual effects breakdown

Learn more about Adobe Creative Cloud

Download a free trial of Adobe Creative Cloud


Posted by: Adam Spiel on Jan 22, 2014 at 2:03:44 pm After Effects, Customers

Hasraf ‘HaZ’ Dulull continues to innovate with Adobe Creative Cloud

New animated short film for Universal Studios expands on the visual style of HaZ’s first film Fubar Redux

Our friend and Adobe Creative Cloud enthusiast Hasraf ‘HaZ’ Dulull was recently hired by Universal Studios to direct a motion comic as part of the marketing for 47 Ronin, starring Keanu Reeves. Working with the team at production studio DSF, HaZ created the animated short film 47 Ronin: The Samurai Spirit in the same style as his short film Fubar Redux.



The animated short expands on the visual style of Fubar Redux with DSF's Hyper Motion Cinema format, which combines branded and original short-form content with a mix of still photography, VFX, and animation. It was created entirely with Creative Cloud, utilizing the Dynamic Link workflow between Adobe Premiere Pro CC and Adobe After Effects CC and Direct Link between Adobe Premiere Pro CC and Adobe SpeedGrade CC.

HaZ says the team at Universal Studios is very happy with the project. “The whole workflow with Adobe Creative Cloud made it such a fun project to do,” says HaZ. “We were not worried about technical pipeline but instead just pushed ourselves to see how far we could take the Hyper Motion Cinema animation format creatively using Adobe Photoshop, After Effects, Premiere Pro and SpeedGrade throughout.”

Watch the short here:

HaZ will be participating in the Adobe panel discussion: Engaging Story, Brilliant Visuals, Low Budget - the changing face of indepe... at the Sundance Film Festival. Stop by to meet him or click the link to register for the streaming panel discussion on Friday, January 17, 2014 from 2:00 PM - 3:30 PM PT.


Posted by: Adam Spiel on Jan 7, 2014 at 10:32:34 am After Effects, Customers

Joseph Kosinski’s film “Oblivion” showcases elegant effects created with Adobe After Effects

World-class user interface designers, graphic artists, and animators create crisp, timeless visual elements for sci-fi film

When you want great visual effects for your latest futuristic film, you need to hire a great team. That's exactly what Joseph Kosinski did for his latest film, Oblivion, starring Tom Cruise as a drone repairman working on an uninhabitable future Earth. Kosinski turned to Bradley G. Munkowitz (GMUNK), with whom he'd previously worked on Tron: Legacy. As lead interface graphic designer, GMUNK pulled together a team that included Interface Graphic Designers Joseph Chan (Chanimal) and Jake Sargeant, and Interface Animators Alexander Perry (AP), Navarro Parker (Nav), and David Lewandowski (D-Lew). They were happy to reunite and share how much fun they had creating the film's 2D effects with an Adobe workflow. (The film is now available on DVD.)



Adobe: What was your vision as design director? How did you become involved?

GMUNK: I had a prior relationship with Joe Kosinski because we worked together at Digital Domain doing some TV commercials for HUMMER. Then I did all the holograms for Tron: Legacy. I've been a graphic artist for more than 12 years, and Joe respects my work. So when he asked if I was interested in Oblivion, there was no question in my mind.

Adobe: How did the rest of the design and animation group come together?

GMUNK: I actually met Alex (AP) at a party and I knew he was fantastic at animating interfaces. Then we brought on Chanimal and Nav, some of the most amazing animators we’d ever known. Nav has been creating futuristic user interfaces (FUIs) for at least seven years, and Chanimal has a similar level of artistry and experience. This was such a unique gig because we actually assembled our little band together in Culver City right in Joe’s office, so we were sitting next to this famous director and his personal assistant. Every day felt like play, not work. And as design director, I tried to infuse some silly fun into the project. What was great about this project was that we could sketch out and present ideas and discuss them with Joe right there. He was accessible and amenable to a lot of back and forth. It was very fun, loose, and collaborative.

Adobe: What were the first steps on the project?

GMUNK: We wanted to create a very energetic and visually stunning sci-fi world with a clean, sharp "vector-ish" feel to it, so for the graphic design we used Adobe Illustrator and then brought the graphics into After Effects. There were two phases of the project. We started by creating an interactive light table for the sky tower. This was a big push, with a team of four doing all the concepting, designing, animating, and outputting for the live action shoot of the sky tower. I worked on this part with D-Lew, Chanimal, and motion graphics guru Jake Sargeant.



Next, we (Nav, Alex, Chanimal, and I) started on the post graphics, which included cockpit elements for the Bubble Ship and heads-up display (HUD) elements for various equipment and weapons in the film. Throughout the whole process we used Illustrator, After Effects, and a little bit of CINEMA 4D. Keep in mind that this is not really a flashy 3D movie with all kinds of holograms and crazy 3D elements. It's meant to look timeless, minimalist, and elegant, so Illustrator, After Effects, and CINEMA 4D were perfect because they gave it the clean, 2D look we wanted. We used a little Photoshop here and there for text and some Adobe Premiere Pro just for quick playbacks.

Adobe: Alex, what did you contribute to the project?

AP: I did a lot of the animations that you see just once in the movie, such as a reticle (a grid or pattern placed in the eyepiece of an optical instrument) locking onto a drone. I also worked a lot on the small effects on the gun reticles and the undulating, looping drone vision. Working on graphics that are supposed to look like they have a function was a fun challenge. In terms of using After Effects, I kept it pretty traditional and used only a few effects. I love animating masks and assets from Illustrator because the workflow is straightforward and the integration is great.

Adobe: Nav, how did you get involved in the project and what was your primary role?

Nav: I've always loved computer screens in movies. I had worked with GMUNK on a commercial for Sony, co-branded with the movie Skyfall, and we enjoyed collaborating. I was thrilled when he asked me to join the team on Oblivion. I primarily worked on the DNA sequence, the drone vision "eyeball," and a ton of shots for the Bubble Ship HUD. I also assisted GMUNK with a weather display sequence that is part of the deleted scenes on the Blu-ray.



Adobe: What were some of your main challenges and how did After Effects help you overcome them?

Nav: We were working at super high resolutions, mostly 4K with a handful of shots in 8K, because the goal was to keep the micro-fine detail absolutely crisp and beautiful. Because After Effects can infinitely scale vector layers, we were able to scale up any vector file by massive amounts without losing detail. We then handed our work off to Pixomondo, who composited everything into the final shots. After Effects also helped us work quickly: thanks to the global cache, once everything is initially rendered, a change to one layer can be re-rendered much faster.
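The time savings Nav attributes to the global cache can be pictured as a frame store keyed by each layer's settings: editing one layer invalidates only that layer's entries, and everything else is served from cache. The Python below is a deliberately tiny sketch of that idea, not how After Effects implements its cache internally.

```python
# Hash-keyed frame caching in miniature. The f-string stands in for the
# expensive per-layer render; everything else is bookkeeping.
cache = {}

def render_layer(layer_id, settings, frame):
    key = (layer_id, tuple(sorted(settings.items())), frame)
    if key not in cache:
        cache[key] = f"pixels({layer_id}@{frame})"  # expensive work happens once
    return cache[key]

def render_comp(layers, frame):
    return [render_layer(lid, s, frame) for lid, s in layers.items()]

layers = {"sky": {"blur": 2}, "hud": {"opacity": 80}}
render_comp(layers, frame=1)     # both layers rendered and cached
layers["hud"]["opacity"] = 60    # only the HUD's cache key changes
render_comp(layers, frame=1)     # "sky" is served straight from the cache
```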

Adobe: As you move into using Adobe Creative Cloud more, are there any features you’re especially excited about?

Nav: Specifically for me, the integration between After Effects CC and CINEMA 4D is really useful because I won’t have to be jumping back and forth between applications. I can make faster changes and refinements and have more time to creatively experiment.

Adobe: What made this project special?

GMUNK: I am not sure anything quite like this will ever happen again, where a team like this comes together and we have so much close collaboration with a director like Joe Kosinski. I heard designer Jessica Walsh speak at a conference once and she said something along the lines of, “If you treat work as play, you’ll never work another day in your life. The people you work with, the fun you have, and the sheer joy in creating takes precedence over everything.” That’s what this project was like.



Learn more about the video tools and services in Adobe Creative Cloud
Download a free trial of Adobe Creative Cloud


Posted by: Adam Spiel on Nov 21, 2013 at 9:33:15 am After Effects, Customers

Upping the HUD ante with Adobe After Effects

Cantina Creative creates next-generation heads-up display effects for Marvel’s Iron Man 3

Stephen Lawes, creative director and co-owner of Cantina Creative, and Cantina VFX supervisor Venti Hristova raised the bar once again on the heads-up displays (HUDs) for Marvel's Iron Man 3, which premiered in May and was released on DVD in September. The film features Robert Downey Jr. as superhero Tony Stark, along with Gwyneth Paltrow, Don Cheadle, and other top stars in supporting roles. Lawes and Hristova's primary task was the creation of 100 HUD-only shots in the film, used with multiple versions of Iron Man's and other characters' suits. Lawes and Hristova discuss how they used Adobe After Effects to create the next-generation HUDs for Marvel's latest superhero hit.

Adobe: Some people might not fully understand what a HUD is. Can you explain it to us in layman’s terms?

Lawes: Basically, a HUD shows what Stark (Iron Man) is seeing as a graphical representation when he's in one of his suits. It represents either what he sees from his point of view when he's looking out, or what is reflected in his visor.



Adobe: What made Iron Man 3 especially challenging?

Hristova: We had to do 100 HUD-only shots, which is pretty daunting. Then we also discovered that in a very short period of time, the CG Iron Man suits had made a huge leap. We met with production VFX supervisor Chris Townsend and VFX producer Mark Soper in September of 2012 and learned that the armor-like CG suits used in movies like Marvel’s The Avengers and Iron Man had gone through multiple generations, from Mk7 to Mk42. They had broken new ground in terms of the number, design, and complexity of the suits, so the HUDs had to match the new suits seen in the film.

Adobe: How did you go about taking the HUD experience to the next level?

Hristova: We needed to push the visual sense that 2D elements would really have a 3D feel, a physical presence like a hologram. We built two laser lights into the actual HUD and were able to project holograms from them. We also gave the graphics a more tactile, textural feel. They glow in a glassy way that feels photorealistic. I know the word photorealistic can be misconstrued in many ways, but in this case it just meant that we really had to create the optical illusion of 2D objects genuinely feeling and looking 3D, even adding light streaks to the objects. We used both After Effects and MAXON CINEMA 4D to accomplish this. Toward the end, we started using the 3D capabilities in After Effects so that we wouldn't have to re-render a shot every time we added a new dimension. That really sped up the process.



Adobe: When you’re working with HUDs, it’s very procedural and it’s crucial to maintain consistency from one graphics effect to another. That requires a lot of complex math. How did you deal with this in Iron Man 3?

Lawes: After working on HUD VFX for three different movies, including The Avengers, we created a rig that sets up the math expressions so they are embedded within the script. The rig uses 3-point tracking in After Effects: it captures the two corners of the eyes and the tip of the nose and triangulates that data to create a virtual 3D space. The VFX artists can focus on being artists, not programmers, and that's what we want to achieve. We also train each artist on how to maneuver, track, and animate the HUD with some intuitive booklets we've created, so they don't have to know the math behind it. One caveat: most of our VFX artists are experienced and have previous knowledge. We don't want to take risks with the HUDs because they're pretty difficult to wrap your head around (no pun intended). Over five months we averaged only about seven and at most ten artists, because we'd rather use fewer experienced artists, dedicated for longer periods of time.
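As a rough illustration of the three-point idea, and only that (Cantina's rig is built from After Effects expressions and is far more sophisticated), the sketch below derives a HUD layer's position, scale, roll, and an approximate yaw and pitch from two tracked eye corners and a nose tip. The specific math and names are simplifying assumptions, not the production rig.

```python
import math

# Derive a simple HUD transform from three tracked 2D points. The
# specific formulas are illustrative assumptions, not Cantina's math.

def hud_transform(eye_l, eye_r, nose):
    mid = ((eye_l[0] + eye_r[0]) / 2, (eye_l[1] + eye_r[1]) / 2)
    dx, dy = eye_r[0] - eye_l[0], eye_r[1] - eye_l[1]
    eye_dist = math.hypot(dx, dy)
    return {
        "position": mid,                           # anchor the HUD between the eyes
        "scale": eye_dist / 100.0,                 # apparent size tracks depth
        "roll": math.degrees(math.atan2(dy, dx)),  # head tilt
        # The nose's offset from the eye midpoint hints at yaw and pitch.
        "yaw": (nose[0] - mid[0]) / eye_dist,
        "pitch": (nose[1] - mid[1]) / eye_dist,
    }

print(hud_transform(eye_l=(430, 520), eye_r=(530, 516), nose=(484, 580)))
```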

Adobe: How did Cantina create the graphics shots?

Hristova: At Cantina, we work together to maximize our strengths and many artists contribute to most shots. The graphics on this project were created in After Effects with a team of artists and production staff that included Sean Cushing, Lily Shapiro, Alan Torres, Leon Nowlin, Matt Eaton, Aaron Eaton, Lukas Weyandt, Jon Ficcadenti, Johnny Likens, and Jayse Hansen.

Adobe: Did you also create the HUDs for other characters?

Hristova: Yes, we created a HUD for Rhodey and other characters. We did around 40 HUD CG shots just for Rhodey. Again, we combined mini-teams and assigned them to specific projects so that everyone could maximize his or her strengths.



Adobe: As you move into using Adobe After Effects CC, are there any features you’re especially excited about?

Lawes: Specifically for us, the integration of CINEMA 4D in After Effects CC is fantastic because we won't have to hop back and forth between applications, and rendering time will be greatly minimized. Also super helpful for us in After Effects CC are the 3D tracking and the new scaling algorithms—both have improved by leaps and bounds. Adobe meets in person with us a couple of times each year to exchange ideas. We love it because we're able to provide technical feedback on how the development team can help solve the issues we face, and the Adobe team is very savvy and responsive.

Adobe: Why is After Effects CC so well suited for what you do?

Hristova: I think in many ways, we’re the ideal customer for Adobe’s video tools. Our work spans motion graphics, as well as VFX for film and commercials. So we cover the same ground After Effects covers. We use After Effects 90% of the time in our workflow.

Adobe: What’s next for you?

Lawes: We are collaborating with our close friends and neighboring office mates, Bandito Brothers, on Need for Speed, which is based on the popular Electronic Arts game and slated for release in February 2014. This is going to be huge, as in roughly 900 shots. We've also got a couple of interesting commercials and movies coming up that we can't talk about yet.

One of the things we're most excited about is that Venti and I are starting to co-direct commercials that are spec projects with no client involved, so we're letting our creativity run wild. We're using more craft-oriented animation techniques, from hand-drawn pieces to stop-motion and CG, incorporating more of a mixed media approach. Ultimately, this will encompass the launch of an entirely new spin-off company, Little Foot, and the productions will appear on social channels. It's very entertaining for us, and, of course, creative freedom always makes us happy.



Learn more about the video apps and services in Adobe Creative Cloud


Download a free trial of Adobe Creative Cloud





Posted by: Adam Spiel on Nov 20, 2013 at 1:06:30 pm After Effects, Customers

VFX and motion graphics experts use Adobe After Effects on “Star Trek Into Darkness”

Andrew Kramer focused on movie titles while OOOii team created stunning graphics and heads-up displays for blockbuster film

Resurrecting the classic science fiction franchise of Star Trek is certainly a challenge that any motion graphics or VFX artist would gladly accept. For the latest installment, Star Trek Into Darkness, Andrew Kramer of Bad Robot, author/owner of the site Video Copilot, was the lucky one tapped to create more than 30 title sequences for the movie. Production studio OOOii eagerly took on the job of designing all user interfaces and future technologies within the movie. Kramer, OOOii CEO Kent Demaine, and Lead Designer Jorge Almeida all shared their great experiences working on the latest Star Trek film (now available on DVD).



Adobe: What were some of the unique challenges of creating titles for Star Trek Into Darkness?

Kramer: I’m sure the folks at OOOii will agree that one of the biggest hurdles was that the film was shot in stereo so everything had to be “true” 3D from start to finish to look realistic and fully dimensional. We used After Effects and Video Copilot’s Element 3D plug-in to create real 3D object-based particles within After Effects. We had a short timeline to create 30-plus titles, so we did not have the hours or need to delve into what people might consider a full-fledged 3D tool. The other challenge was that we had to stick to the same basic Star Trek aesthetic, but make everything more refined and pay strict attention to details.

Adobe: Tell us about the work you did on the film.

Kramer: With a short timeline to create and finalize the titles—filled with very long days and nights—we created more than 30 different titles. J.J. Abrams, the film's director, asked me to come up with a more refined, updated look for the extreme space fly-through title sequence. We had a small modeling team that created a library of high-quality 3D planets, from moons to asteroids and ice planets. The planets were color-corrected, and we even designed a planet that broke into pieces of debris. We replicated the pieces hundreds of thousands of times and had them breaking off to give the scenes a sense of depth.



There were multiple "spacescapes" and we needed to make sure they were colorful, vivid, and tangible, with debris, asteroids, lens flares, and so on. The cool thing about the lens flares is that they were built from scratch: we filmed real flares, isolated each individual iris element frame by frame in Photoshop, and then added textures. We got some really fun effects. Ultimately, we wanted each title scene to have its own unique world. For example, we created one world with the sun burning with hot-looking tendrils as the core, building the texture from animated fractal noises. As you can tell, we had a lot of fun and tried techniques we never had before. The ability to work natively in After Effects for a stereo workflow, not having to render out each planet individually, and being able to make changes dynamically that affected all of the designs was pretty critical.
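The "animated fractal noises" Kramer mentions follow the classic fractal Brownian motion recipe: sum several octaves of smooth random noise, each at double the frequency and half the amplitude of the last. The NumPy sketch below demonstrates only that layering principle, using simple bilinear value noise; it is not the implementation behind After Effects' Fractal Noise effect.

```python
import numpy as np

def value_noise(size, freq, rng):
    """Smooth noise: a coarse random grid upsampled with bilinear interpolation."""
    grid = rng.random((freq + 1, freq + 1))
    xs = np.linspace(0, freq, size, endpoint=False)
    i, t = xs.astype(int), xs - xs.astype(int)
    ix, iy = np.meshgrid(i, i, indexing="ij")
    tx, ty = np.meshgrid(t, t, indexing="ij")
    top = grid[ix, iy] * (1 - tx) + grid[ix + 1, iy] * tx
    bottom = grid[ix, iy + 1] * (1 - tx) + grid[ix + 1, iy + 1] * tx
    return top * (1 - ty) + bottom * ty

def fbm(size=256, octaves=5, seed=0):
    """Sum octaves: double the frequency, halve the amplitude each pass."""
    rng = np.random.default_rng(seed)
    out, amp, freq, total = np.zeros((size, size)), 1.0, 4, 0.0
    for _ in range(octaves):
        out += amp * value_noise(size, freq, rng)
        total += amp
        amp, freq = amp * 0.5, freq * 2
    return out / total  # normalized 0..1 grayscale texture

texture = fbm()
print(texture.shape, float(texture.min()), float(texture.max()))
```

Animating such a texture amounts to sliding or interpolating the coarse grids over time, which is what gives the "burning" look its slow churn.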



Adobe: Kent, what was your team’s primary role?

Demaine: OOOii worked closely with the art team and Production Designer Scott Chambliss to develop iconography and a visual language for this iconic film. Our interfaces and vision of future technologies helped define the revamped USS Enterprise and Starfleet and support the unique vision of the filmmakers. Our specialty is information technology design and human-technology interaction. This involves the creation of things such as holograms, digital signage, and interfaces on devices. Applying this expertise in films helps provide a vision of new interaction modalities with technologies such as mobile phones—everything we do focuses on natural interaction models between users and information.

Almeida: All of our designers are set up with full Adobe Creative Cloud pipelines. We started by creating some concept art that gave an idea of the style, and then we designed based on those concepts, using the previous movie as a guide. Initial designs were done in Illustrator and Photoshop, but by the middle of the film, when we had a lot of the elements built, we would just go straight into After Effects. Once we homed in on a look everyone liked, we started showing animation tests. There was a lot of back and forth, but as the production moved along everything became more specific.



Adobe: Can you give us an idea of the scope of the work that OOOii did on the film?

Demaine: We created more than 300 production and post-production shots over a year’s time. We do the designs and we often composite them, too, but in this case ILM composited the graphics and HUDs into the scenes.

Adobe: Here's a question for both of you, Andrew and Kent: did you use Adobe Premiere Pro for any of your work on Star Trek Into Darkness?

Kramer: We used Adobe Premiere Pro primarily for editing, timing, and prepping. We would do all of our timing, figure out the music beats, bring in the score, and lay in the still frame designs before we did the camera movement and final compositing. We also used Premiere Pro to give animators reference points for different locations so that everything moved as one flowing, orchestrated whole. This was important because the complexity of our stereo camera rig meant there were thousands of layers involved.

Also, one of the key things in the title sequence was getting the color right. For that, we used Adobe Bridge to open the screenshots of each planet design, for instance, and show them all on the screen at one time to see what color palette looked the best: blue highlights, green highlights, too much red, needed more monochrome, and so on. We used it as a color reference for final output before animation.



Demaine: We were traditionally an Avid or Final Cut Pro shop but we’re now exclusively Premiere Pro. We do a lot of plate acquisition through RED cameras, and with Premiere Pro we’re able to drop whatever we are given onto the timeline. We can get camera data from ILM and render out graphics using that information. Like Andrew, we also used Bridge as a file management tool because it let us preview files, visualize layers in files, and drop them into different programs with ease.

Adobe: What about Adobe Creative Cloud? Are you using it, and if so, what are you most enjoying about the newest version of Adobe’s cloud-based video and creative software?

Kramer: I've been using Creative Cloud for a while now and, although I know it's not an entirely new feature, the Warp Stabilizer in After Effects is definitely more advanced and mature. The stabilization movements seem more fluid, and that saves a lot of time. After Effects also seems to work more responsively with plug-ins, with faster GL rendering speed, which saves even more time. The quality is magnificent, akin to IMAX feature quality.

Almeida: I'm having a lot of fun using the 3D features in After Effects to build animations. I've also just started to dive into CINEMA 4D and look forward to exploring what that integration can do. The speed and flexibility of After Effects are what I like the most. I often get to the point in post where I'm designing directly in After Effects as much as anything else. I actually prefer it because I can start building things that are more of a finished product, and After Effects just lets me design on the fly.



Demaine: Adobe Creative Cloud came along right when we were finalizing this project, but we were already beta testing it and we are impressed. We really like the instantaneous feature updates and more predictable pricing that Creative Cloud offers. I think it's a good step forward that will let us take advantage of Adobe innovations even faster than before. We'll also continue to venture into other areas of Adobe's tools without additional charge. For example, on this project, we worked with BlackBox Digital and used Adobe Flash Professional and Adobe AIR extensively to create a cool mobile app that revolves around some of the most impressive VFX scenes in the movie. We think this is just the beginning of what Adobe Creative Cloud will facilitate.

See how Andrew Kramer created the titles for Star Trek Into Darkness using After Effects and Element 3D.


Learn more about the video apps and services in Adobe Creative Cloud
Download a free trial of Adobe Creative Cloud


Posted by: Adam Spiel on Nov 19, 2013 at 11:10:30 am After Effects, Customers

Stargate Studios Sets the Bar for Visual Effects on Productions

Adobe Creative Cloud and automation create competitive edge for iconic production studio

Delivering content for some of television's top shows, including House of Lies, Grey's Anatomy, and, of course, The Walking Dead, Stargate Studios continues to break the mold of how feature films, television series, electronic games, and commercials are made. The studio was founded in 1989 by Sam Nicholson, a distinguished cinematographer and visual effects supervisor with 30 years of expertise in film, television, and visual effects. The now international production company provides concept development, advanced production services, and state-of-the-art postproduction services.

In an industry that requires ever-more impressive productions to be created with the same or even smaller budgets than in years past, Stargate has developed a winning strategy. We had a chance to sit down with Sam Nicholson, CEO and founder, and Adam Ealovega, vice president of technology for Stargate Studios to learn more about their strategy and success.


Stargate Studios

Adobe: What do you think differentiates Stargate from other studios?

Nicholson: Over the years, Stargate has adopted the latest cameras, visual effects, editing, and other tools and pushed them to their limits. We have always stayed at the forefront of technology with developments such as our Virtual Backlot (VB), VB Library of virtual environments and stock locations, and VB Live process of real-time compositing. We've also developed and refined a diverse array of proprietary production tools. And our people are the best of the best, with advanced skills in live action film, HD production, CGI, digital compositing, matte painting, and online HD editing.

We are working really hard toward streamlining the difficult process of visual effects. Most people think of VFX like an all-day root canal. They’re expensive. They’re slow. They’re painful. We’re working hard to change that. For instance, on The Walking Dead, we’re showing how creating zombies digitally can be faster and less expensive than physically producing them using prosthetics—although the award-winning prosthetics on The Walking Dead are impressive and deserving of the accolades they receive.


The Walking Dead Season 2 Visual Effects Reel

Adobe: What does being at the forefront of technology mean from practical and business standpoints?

Nicholson: We've been able to grow from a company based in Los Angeles to one with facilities in Los Angeles, Toronto, Berlin, Vancouver, Malta, Atlanta, Dubai, and soon Cairo. And what's amazing is that we can go into countries like Germany or places in the Middle East, where the cost of business and labor is much lower, and win the business. We're not just involved in doing blockbuster features with unlimited budgets. Instead, we're doing a lot of international TV shows and blockbusters where budgets are very, very low—but at the same time, we refuse to sacrifice quality and creativity. One major benefit is our use of Adobe Creative Cloud. The software is well integrated and easily attainable via Creative Cloud. Everyone is on the same version, and it's easy to update when our team members are working all over the world.

Adobe: How is it possible for Stargate to expand when other companies may be having difficulties?

Ealovega: We face a wide range of both creative and financial challenges and are always looking for flexibility and economy. We're longtime After Effects users—we began using it even before Adobe acquired the software from CoSA—and it is the linchpin of our operations. We use Adobe Photoshop for all of our matte paintings. We're not an editing house per se, but Adobe Premiere Pro is our tool for ingesting everything, in pretty much any format. We're also starting to use Adobe Prelude for reviewing and Adobe SpeedGrade for color grading VFX. With Creative Cloud, we can work with artists from all over the world and streamline our creative and production processes so we can pass the savings and creative options along to our clients.


The Walking Dead Season 2 Visual Effects Reel

Adobe: How does your process work?

Ealovega: We essentially started working in the cloud before Adobe Creative Cloud was a reality. Our clients present us with the rough edit of a show online. Then we gather the source EDL files and match them to the original camera source to be sure we're on target.

We match up what we're finding in Premiere Pro with what we're seeing in After Effects and ingest it into our Stargate system, which is capable of sharing files with each of our facilities. We transfer all the information into our VFX system, and the assets then get transferred to the local digital asset management system at the site that's designated to work on the project. Keep in mind that this is all transparent to the VFX and editorial departments. So essentially, any artist, staff member, or supervisor in any of our global facilities can review material in real time, create proxies, and send a project to a render farm in the background.

Using our proprietary software, we can review a VFX spot, make assignments, and build projects in After Effects. The composited assets automatically land in the final facility, in the right color space specified by the client. The artist doesn't even have to know what color space is needed. It’s applied automatically.
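A pipeline like the one Ealovega describes can be reduced to a lookup table: the delivery spec, not the artist, decides the color space and destination for each project. The Python below sketches that idea under made-up project names, facilities, and color spaces; it is not Stargate's actual software.

```python
# Hypothetical delivery specs keyed by project; the artist never
# touches these values directly.
DELIVERY_SPECS = {
    "show_a": {"facility": "Toronto", "color_space": "Rec. 709"},
    "show_b": {"facility": "Berlin", "color_space": "DCI-P3"},
}

def convert_color_space(shot, space):
    return f"{shot}[{space}]"  # stand-in for a real color transform

def send_to_facility(shot, facility):
    print(f"delivered {shot} to {facility}")

def deliver(project, shot):
    spec = DELIVERY_SPECS[project]
    graded = convert_color_space(shot, spec["color_space"])  # applied automatically
    send_to_facility(graded, spec["facility"])

deliver("show_a", "ep101_0420_comp_v3")
```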

Adobe: Why is Creative Cloud central to your workflow?

Nicholson: We work on productions shot all over the world at different frame rates and on multiple cameras. I think we work with, literally, almost every camera format ever invented. We have to be able to ingest material from every camera manufacturer in the world at everything from 6K to PAL and NTSC resolution. Truly, we can’t use anything but Premiere Pro. It’s the only system that will handle the different formats we’re shooting and throwing on the timeline.

Ealovega: I have to add that Creative Cloud, and After Effects in particular, is an open framework. We can add functionality to the software through open source tools. This has allowed us to write our own virtual operating system ("VOS") to assist us in integrating all of our facilities. It's one thing to say that you've got one big facility with an integrated pipeline. It's an entirely different thing to say you have seven or eight facilities and to be able to span each of them with render data and have artists flowing their work back and forth to one another. Our artists can literally click a flag of a country on their screens, and it sends an automated delivery to that render farm and ensures the work is rendered correctly and delivered to the right destination. We have also incorporated other tools into Creative Cloud so that we can view dailies very quickly, without spending a lot on the per-seat codec licensing that would otherwise be required.


Stargate Studios

Adobe: What do you see as the vision for the future?

Nicholson: Frankly, we hope we are inventing it. We create about 10,000 shots a year—that’s 40 to 50 shots per day from simple driving comps to 3D interfaces involving our virtual backlot. Using our workflow, we’re really proud of the quality. We are shooting four to five cameras at 4K resolution simultaneously. So we’re talking about a 16K to 20K background, and every frame must be processed. We break the frames up on the back end to be as light as possible. Then we use After Effects as a front-end tool to proxy the footage, and stitch it all back together.
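The split-and-stitch step Nicholson describes can be pictured as straightforward tiling: break the very wide stitched background into lighter per-camera tiles, work on them independently, and reassemble. The NumPy sketch below shows only that tiling idea; the frame size is a stand-in for the 16K-to-20K backgrounds he mentions, and the real pipeline is proprietary.

```python
import numpy as np

def split_tiles(frame, tile_w):
    """Cut a frame into vertical strips no wider than tile_w pixels."""
    return [frame[:, x:x + tile_w] for x in range(0, frame.shape[1], tile_w)]

def stitch(tiles):
    """Reassemble the strips into the full-width frame."""
    return np.concatenate(tiles, axis=1)

# Stand-in for a 20K-wide background (five 4K cameras side by side),
# at reduced height to keep the example light.
frame = np.zeros((256, 20480, 3), dtype=np.uint8)
tiles = split_tiles(frame, tile_w=4096)  # one tile per camera
assert stitch(tiles).shape == frame.shape
print(len(tiles), "tiles")
```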

Think of it as real-time compositing on green screen. It reminds me of sound mixing. You wouldn’t mix one or two cuts like you do two or three scenes in a movie. Adobe products and our own innovation are leading us into a visual mixing arena where we are talking about time rather than an individual cut. It’s all about the automation and efficiency of the tools, so that artists can focus on the genuinely complex shots. We can increase our creative capabilities and quality at the same time. The world has changed so that feature-film quality is being achieved on television series, and that efficiency has reached new heights. Creative Cloud and our virtual operating system let us accomplish two times the shots most companies can achieve—all at half the budget. We think it’s a model that will succeed in a world where automation is vital yet the human, creative touch is essential.

Watch the video: http://tv.adobe.com/watch/customer-stories-video-film-and-audio/stargate-st...

Learn more about the video apps and services in Adobe Creative Cloud


Download a free trial of Adobe Creative Cloud




Posted by: Adam Spiel on Nov 8, 2013 at 8:56:59 am After Effects, Customers

A bird’s-eye view of World War II

Emmy-winning documentary created for HISTORY includes 300 animations and 79 VFX shots made with Adobe video tools

When a television program wins an Emmy award for Outstanding Graphic Design and Art Direction, it must be something special. World War II from Space, a program commissioned by HISTORY, is a stunning 90-minute documentary visualizing key events of World War II from the vantage point of space. It was a huge creative endeavor with 300 animations and 79 VFX shots—all completed over the course of a year and a half by U.K. production company October Films and visual effects studio Prime Focus. Simon George wrote and directed the film, while Prime Focus created the VFX, led by Design and Animation Director Hazel Baird and Creative Director Simon Clarke. We had a chance to sit down with George, Baird, and Clarke to discuss how the video tools in Adobe Creative Cloud allowed them to create their own bird's-eye view.



Adobe: How was World War II from Space different from other projects for Prime Focus?
Clarke: It was the first time that a show like this was created using only visual effects; the only live action is the interviews. Prime Focus engaged with October Films to co-direct, in a way. We wanted to make learning about World War II much more appealing to younger audiences, more exciting than black-and-white film images. We asked ourselves how we could get a new generation interested while still layering in the amazing information we had from our faithful historians. Ultimately, we created 78 minutes of pure CG content for a 90-minute program.

Adobe: Can you tell us a little more about what makes World War II from Space special?
George: We were recounting the battles and shifting tides of the war from a bird's-eye view, so there was enormous reliance on animations and VFX to tell the story. We had to rely on globe-spanning maps and highly detailed computer animations to recreate events from Pearl Harbor to the atomic bomb. We wanted to create a new style that would be relevant for all ages and that would be visceral, informative, and visually stunning.


Pop Culture Lens

Adobe: With so many VFX shots and animations over 90 minutes, what were your biggest personal challenges as a filmmaker?
George: I really liked the idea of telling the story from space because it was such a grand concept that had never been attempted before. So many things in World War II were occurring simultaneously, and each influenced the others. There was also massive global geopolitical wrangling, and the only way that can be captured in a visually stunning and meaningful way is from space. But I have to say, as I started, I didn't know if it was a good idea. Nine months into storyboarding, drawing, and designing, I still had some doubt: could we make it work? As a filmmaker, it is difficult to think of an idea that is so enormous. When you are shooting a drama, you have finite resources. Yet when you are building everything in CGI, anything is possible. Our biggest challenge was how to rein the show in, yet make it exciting.

Adobe: How did you stay organized with so much going on?
George: The hub we used to stay in sync was a combination of Adobe Story and Adobe Premiere Pro. The last thing I wanted was a bunch of different drafts of the script flying around, so I used Adobe Story for writing. Once I made changes, they were instantly reflected online so the producers, editors, and designers could see where my head was at in an instant. Then I had all the interviews with people like Pulitzer Prize-winning historian David Kennedy transcribed and brought into Adobe Story. Everything could be shared instantly. We married the transcribed text with images, grabbed the bits we needed using metadata, and put them onto the Premiere Pro timeline.


Map Lens

Adobe: How did you actually capture the interviews with all of these top historians and military brass?
George: We had to be pretty methodical to keep everything organized. We flew to the Pentagon, San Francisco, and a variety of other places to interview leading historians. We shot the interviews on ARRI ALEXA cameras in ProRes, so the project was pretty large. Then we came back and got the files transcribed. We imported clips into Premiere Pro and put clip numbers on the text in Adobe Story. So rather than doing cuts on paper, we cut them together in Premiere Pro on screen, live. The integration between Adobe tools saved a massive amount of time. We could even type in a word, locate that particular text, and put a clip in, literally saving days. That's the challenge with documentaries: there is so much material to keep organized. We had the concept down, but we started seeing how it would really come together once we started cutting everything together in Premiere Pro. We would even block things out as rough storyboards, such as the Pearl Harbor attack, and scan them into Premiere Pro to see how everything was working and get a good idea of how to move forward.
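The days saved here come from treating transcripts as searchable, timecoded metadata. The sketch below shows the bare idea, assuming each transcribed segment carries a clip number and an in-point; the data and field names are illustrative, not Adobe Story's actual format.

```python
# Hypothetical transcript segments, each tagged with a clip and timecode.
transcripts = [
    {"clip": "kennedy_01", "tc_in": "00:04:12",
     "text": "The attack on Pearl Harbor changed everything."},
    {"clip": "kennedy_02", "tc_in": "00:19:40",
     "text": "Global supply lines decided the Pacific war."},
]

def find_segments(phrase):
    """Return (clip, timecode) pairs whose transcript contains the phrase."""
    phrase = phrase.lower()
    return [(s["clip"], s["tc_in"]) for s in transcripts
            if phrase in s["text"].lower()]

print(find_segments("Pearl Harbor"))  # -> [('kennedy_01', '00:04:12')]
```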

Adobe: Tell us more about the actual design process.
Baird: I started off creating mood boards to show the different design approaches we could take. Once I had an understanding of where to go, I started designing the look of the different styles, or "lenses." There were about six: a tech lens (UI graphics), a pop culture lens, a graphic lens, and so on. Because of the varying lenses, I was worried that the result could look a bit mismatched, so I came up with a grid that is present in all the animations and ties them together. I used After Effects to create my storyboard images. I prefer using After Effects because then I already understand how the elements are going to move. I have worked that way for years and like the approach.


Tech Lens

Adobe: What were you responsible for, working with the team at Prime Focus?
Baird: I was responsible for all the animations. For example, the six styles (lenses) were created in After Effects. Our team at Prime Focus would get written notes from Simon George detailing what needed to happen in a given scene (battle scenes, maps, and so on), and then we would use one of the styles to bring it to life visually in After Effects. If there were mock-ups of old documents, we would create them in Photoshop and then bring them into After Effects to animate.

We worked a lot of the more complex sequences out using a combination of CINEMA 4D and After Effects. The integration between these two programs is fantastic for speeding up the workflow. Every time there was a text shot, we would design it in 3D. We also did a lot of the post-production in After Effects to simulate atmospheric conditions and made a lot of particle fields.

Adobe: What were some of your favorite scene elements?
Baird: I loved doing the title sequence, as well as the scenes where I had to mock up documents to look like blueprints. I researched 1940s documents and maps from the War Museum in London to get a sense of the style back then. We often dirtied the images in Photoshop and then brought them into After Effects. The tech lenses looked great too; they really helped bring certain scenes to life. The photographs where we used parallax to give depth came out brilliantly, and we used a combination of After Effects and Fusion to create this effect. We were proud of how they looked on screen.


Parallaxed Lens

Adobe: When did you start believing that the whole concept of using an orbital view of events would work?
George: We had the design down but also had the first six minutes of the film working, and we recognized that the look was exciting and visually stunning. We showed it to HISTORY and they were very excited by the results. But when they wanted to roll with it, we started to feel a lot of pressure to move quickly, so we had to stay focused and work long hours. Every time something was finished, we put it back into Premiere Pro, so we didn't have miscellaneous files scattered all over the place.

Adobe: How was the film finished?
Baird: The initial coloring we did for most of the animations in After Effects didn't change much. We then brought all the shots into Premiere Pro and made any last tweaks. After that, we did a three-day color grade in Baselight to balance everything out and create a slightly darker, grown-up feel. In eight months we finished 80 shots that required a ton of motion tracking. We had a lot of image volume to work through. It was a little grueling, with long days and late nights, but it was also very rewarding.


Real Globe Lens

Adobe: Would you do this again?
George: This project has been quite a learning journey, in terms of telling a huge story in a way that brings all the visual conventions and imagery of modern war reporting to World War II. But I think it really gave viewers a fresh interpretation of the war. Telling almost everything graphically is so freeing and exciting. Yes, I would do a project like this again. The process was fascinating and we’re happy with the end result.

Clarke: As a result of the success of World War II from Space, we’ve been talking about creating other types of shows like this with October Films. It has opened the eyes of other commissioners and broadcasters. These types of projects have always had a very stereotyped format, and now there’s an appetite to try to push the visual aesthetic to a whole new level. There is a whole new generation to expose to the history of their fathers and grandfathers. From a social point of view, it's quite important. We’re translating the legacy to a new audience in a visceral and powerful way that enables them to engage with it. It is important for us to keep pace with the audience’s level of expectation, so the stories can survive and be passed on to a new generation.

Learn more about the video apps and services in Adobe Creative Cloud


Download a free trial of Adobe Creative Cloud



Posted by: Adam Spiel on Oct 29, 2013 at 1:36:14 pm After Effects, Customers



Find out what the movers and shakers in Adobe's Digital Video & Audio Organization are thinking about, and get a glimpse into their vision on everything from product direction to hot trends in the worlds of video production and content creation, as well as see how other filmmakers are using Adobe products to realize their creative visions.