Interview: A Composer Breaks Down Creating 'Cosmic' Soundscapes for Sci‑Fi Shows
Inside a composer interview: how orchestration, trombone and electroacoustic tools create vast sci‑fi soundscapes—practical tips and 2026 trends.
Struggling to separate sensational space-music hype from real scoring craft? Here’s a listener-friendly interview that delivers.
Podcasts and streaming shows promise cinematic space travel — but who actually makes that sonic gravity? In this in-depth conversation (edited for clarity) with contemporary composer Anaïs Moreau, we break down how modern film and TV scores make orchestras and electroacoustic elements sound vast, immersive and otherworldly. If you’re a fan of sci‑fi soundtracks, a podcast listener who wants to decode audio wizardry, or a creator building your own sonic universe, this piece gives practical steps, 2026 trends and tools to use right now.
Why this matters in 2026
Streaming platforms and franchise renewals (see the continuing Star Wars reshuffle under Dave Filoni in early 2026) have pushed studios to demand more immersive scores and modular soundscapes. Dolby Atmos for music and film is standard for premium releases; live-to-picture concerts combining orchestra and electronics became routine in 2025; and hybrid scoring pipelines — human players augmented by spectral synthesis and AI-assisted orchestration — are mainstream. That evolution changes both the creative approach and the technician’s checklist. Below, Anaïs explains how she designs 'cosmic' textures for sci‑fi shows and what that means for composers, mixers, and curious listeners.
About the composer
Anaïs Moreau is a composer and occasional soloist based in Berlin, scoring television dramas, indie sci‑fi features and immersive audio experiences for museums and VR. She studied orchestration at the Conservatoire and has worked on three mid‑budget sci‑fi series in the past four years. Her recent projects blend chamber orchestra, modular synthesis and live-processed brass — a lineage that traces back to contemporary works like Dai Fujikura’s Vase Ocean II and trombone performances by advocates such as Peter Moore.
Interview: Techniques for making orchestras sound cosmic
Q: What’s the first decision you make when scoring a sci‑fi scene?
Anaïs: I start with function, not texture. Ask: is this moment about scale, mystery, threat or wonder? That answer guides orchestration, tempo and spectrum. For scale I lean low and sustained — contrabass, low brass, bass trombone, contrabassoon — often combined with electronic drones. For mystery, high, thin textures and microtonal clusters work better. Knowing the emotional goal saves you from piling on sounds indiscriminately.
Q: How do you get an orchestra to feel 'otherworldly' without losing clarity?
Anaïs: Use contrast and space. Here are the core strategies I use:
- Layered registers: Pair very low sustains with high, breathy winds or harmonics. The human ear ties them to a larger object.
- Extended techniques: Bowed tam‑tams, sul ponticello strings, air tones, and muted brass — these yield timbres that sound unfamiliar yet acoustic.
- Sparse scoring: Fewer players using unusual techniques often beats a full tutti for 'alien' textures.
- Space as instrument: Add long convolution reverbs using impulse responses from large real-world spaces — ship hulls, cathedrals, quarry chambers — to suggest scale without muddying the midrange.
Q: The trombone keeps popping up as a star in modern contemporary works. Why is it effective for sci‑fi?
Anaïs: The trombone has an incredible dynamic and timbral range: from intimate, vocal-like solos to cavernous low brass roars. Recent concert pieces (Dai Fujikura’s work being a high-profile example) and champions like Peter Moore have broadened composers’ imaginations about what the trombone can do. For sci‑fi I use the trombone in three roles:
- Vocal soloist: Use mutes and multiphonics for eerie, human-adjacent lines. A close mic with subtle reverb makes it feel intimate within a cosmic context.
- Low-edge anchor: Bass trombone or trombone section doubled with synth subharmonics creates tectonic mass.
- Textural object: Processed trombone through granular delays and spectral transformers — you keep expressivity but render it unidentifiable.
Electroacoustic integration: bridging analog warmth and digital scope
Hybrid scoring is the norm in 2026. Anaïs outlines a repeatable workflow for integrating acoustic orchestras with electronic elements:
1. Sound first, arrangement second
Start by designing the sonic world with modular synths, found sounds and processed instrument samples. Think of these as the ecosystem your orchestra will inhabit, rather than just accompaniment.
2. Capture acoustic nuance
Record with intent: stereo pairs for ambience, spot mics for personality, and direct DI or contact mics when you plan heavy processing. Preserve dynamics and breath — they sell realism once you warp the signal.
3. Use spectral processing
Spectral tools (harmonic exciter, spectral blur, FFT morphing) are particularly potent for turning familiar orchestral tones into alien timbres while maintaining pitch information. In 2025-26, tools that allow real-time spectral morphing between a trombone and a granular pad have matured; they’re essential for live scoring workflows.
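For readers who want to hear the principle outside a plugin, spectral morphing can be sketched in a few lines of Python with NumPy and SciPy: take the STFT of two sounds, interpolate their magnitude spectra, and resynthesize. This is a toy illustration with synthetic stand-ins (the function name and test tones are ours, not any specific product's algorithm):

```python
import numpy as np
from scipy.signal import stft, istft

def spectral_morph(a, b, mix, fs=48000, nperseg=2048):
    """Crude spectral morph: interpolate the STFT magnitudes of two
    equal-length signals while keeping signal a's phase.
    mix=0.0 -> pure a, mix=1.0 -> a's phase with b's spectrum."""
    n = min(len(a), len(b))
    _, _, A = stft(a[:n], fs=fs, nperseg=nperseg)
    _, _, B = stft(b[:n], fs=fs, nperseg=nperseg)
    mag = (1.0 - mix) * np.abs(A) + mix * np.abs(B)   # blend magnitudes
    phase = np.angle(A)                               # keep source phase
    _, y = istft(mag * np.exp(1j * phase), fs=fs, nperseg=nperseg)
    return y

# Example: morph a brassy square-ish tone toward a noisy pad
fs = 48000
t = np.arange(fs) / fs
tone = np.sign(np.sin(2 * np.pi * 110 * t)) * 0.3          # stand-in for brass
pad = np.random.default_rng(0).normal(0, 0.1, fs)          # stand-in for a pad
hybrid = spectral_morph(tone, pad, mix=0.5, fs=fs)
```

Real-time morphers do considerably more (phase coherence, transient handling), but the magnitude-interpolation core is the same idea.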
4. Choose the right convolution spaces
Impulse responses from unusual spaces (decommissioned observatories, concrete tunnels, airplane fuselages) give instant scale. Anaïs: "I keep a small library of IRs from field recording trips; some of my best ‘cosmic’ reverbs come from a lapsed radar dome."
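Convolution reverb itself is conceptually simple: convolve the dry signal with the impulse response and blend. A minimal Python/SciPy sketch, using a synthetic exponential-decay IR as a stand-in for a captured space (the helper name is ours):

```python
import numpy as np
from scipy.signal import fftconvolve

def convolve_ir(dry, ir, wet=0.5):
    """Convolution 'reverb': convolve a dry signal with an impulse
    response (peak-normalised) and blend wet/dry."""
    ir = ir / np.max(np.abs(ir))
    wet_sig = fftconvolve(dry, ir)[: len(dry)]   # truncate tail to dry length
    return (1.0 - wet) * dry + wet * wet_sig

fs = 48000
t = np.arange(fs) / fs
dry = np.sin(2 * np.pi * 220 * t) * np.exp(-3 * t)           # plucked-style tone
rng = np.random.default_rng(1)
ir = rng.normal(0, 1, fs // 2) * np.exp(-6 * np.arange(fs // 2) / fs)
wet = convolve_ir(dry, ir, wet=0.4)
```

Swapping in a field-recorded IR (a hangar, a tunnel, a radar dome) is where the character comes from; the math stays identical.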
Production techniques that actually translate on streaming platforms
Getting a mix to translate over earbuds, TV, and Atmos systems is harder than ever. Here are actionable rules Anaïs follows:
- Check in mono: Any stereo or spatial trick must still read in mono. This prevents phase cancellation that kills low-end impact on TV systems.
- High-pass with purpose: Roll off below 30–40 Hz unless you want subsonic rumble. On smaller speakers that rumble becomes mush.
- Reference widely: Listen on consumer earbuds, a stereo monitor pair, a 5.1/Atmos room and a laptop. Streaming platforms compress aggressively, so loud, wide mixes can lose detail.
- Automate dynamics: Use gentle multiband compression and automation to keep the 'cosmic' elements from overpowering dialog in TV mixes. The new 2026 streaming loudness targets emphasize consistent perceived loudness across episodes.
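Two of these rules are easy to verify numerically. Below is a small Python/NumPy/SciPy sketch (our own helper names, synthetic test tones) that measures the RMS drop when a stereo pair folds to mono, and applies a purposeful high-pass:

```python
import numpy as np
from scipy.signal import butter, sosfilt

def mono_loss_db(left, right):
    """RMS drop (dB) when a stereo pair is summed to mono.
    A large negative value flags phase cancellation."""
    stereo_rms = np.sqrt(np.mean(left**2) + np.mean(right**2)) / np.sqrt(2)
    mono = 0.5 * (left + right)
    return 20 * np.log10(np.sqrt(np.mean(mono**2)) / stereo_rms)

def highpass(x, fs, cutoff=35.0, order=4):
    """High-pass with purpose: remove subsonic rumble below ~35 Hz."""
    sos = butter(order, cutoff, btype="highpass", fs=fs, output="sos")
    return sosfilt(sos, x)

fs = 48000
t = np.arange(fs) / fs
sig = np.sin(2 * np.pi * 440 * t)
# A 'wide' trick built from near-inverted polarity collapses in mono:
loss = mono_loss_db(sig, -0.9 * sig)   # large negative dB value = broken mix
```

Identical channels give a loss of 0 dB; anything well below that means your spatial trick will not survive a TV downmix.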
Scoring techniques that create a sense of scale
Practically, these are patterns you can apply when you want to emphasize scale.
Harmonic anchoring
Build a low drone that implies a root pitch. Place intermittent high-register harmonics above it. The brain fills in the intervening space — a trick used since orchestral pioneers but refined with modern synthesis.
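The harmonic-anchoring idea can be heard in a toy additive-synthesis sketch: a sustained low root plus a few slowly fading high partials of the same fundamental. This is our own minimal Python/NumPy illustration, not a production patch:

```python
import numpy as np

def harmonic_anchor(f0=55.0, fs=48000, dur=4.0, partials=(8, 10, 12), seed=0):
    """Low root drone plus intermittent high harmonics of the same
    fundamental; the ear infers the unplayed space between them."""
    t = np.arange(int(dur * fs)) / fs
    out = 0.4 * np.sin(2 * np.pi * f0 * t)            # sustained root
    rng = np.random.default_rng(seed)
    for p in partials:
        # fade each harmonic in and out at its own slow random rate
        env = 0.5 * (1 + np.sin(2 * np.pi * rng.uniform(0.05, 0.2) * t
                                + rng.uniform(0, 2 * np.pi)))
        out = out + 0.08 * env * np.sin(2 * np.pi * f0 * p * t)
    return out / np.max(np.abs(out))                  # peak-normalise

bed = harmonic_anchor()   # 4 s bed rooted on low A (55 Hz)
```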
Orchestral clusters and microtonal drift
Unison followed by microtonal detuning (strings or brass slightly off-centre) creates an unstable, expanding sound. Anaïs uses slow pitch-drift LFOs on synth layers to subtly detune sampled choirs and brass.
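The slow pitch-drift LFO trick amounts to layering copies of one pitch, each wandering a few cents sharp and flat on its own cycle. A hedged Python/NumPy sketch of that idea (our own function, synthetic sines standing in for sampled choirs or brass):

```python
import numpy as np

def drifting_unison(f0=220.0, fs=48000, dur=3.0, layers=5,
                    depth_cents=12.0, lfo_hz=0.1, seed=2):
    """Several copies of one pitch, each slowly drifting a few cents
    sharp/flat via its own phase-offset LFO."""
    t = np.arange(int(dur * fs)) / fs
    rng = np.random.default_rng(seed)
    out = np.zeros_like(t)
    for _ in range(layers):
        cents = depth_cents * np.sin(2 * np.pi * lfo_hz * t
                                     + rng.uniform(0, 2 * np.pi))
        freq = f0 * 2.0 ** (cents / 1200.0)        # cents -> frequency ratio
        phase = 2 * np.pi * np.cumsum(freq) / fs   # integrate instantaneous freq
        out += np.sin(phase)
    return out / layers

cluster = drifting_unison()   # unison that slowly expands into a cluster
```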
Temporal stretch
Take a recorded acoustic hit and time-stretch it into a pad using granular synthesis. It keeps the instrumental color but becomes a sustained, weathered field.
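A bare-bones granular stretch is just overlap-add: read short windowed grains from the source at a slow rate and paste them at the normal rate, with a little position jitter to avoid metallic combing. A minimal Python/NumPy sketch under those assumptions (names and test signal are ours):

```python
import numpy as np

def granular_stretch(x, stretch=8.0, grain=2048, hop=512, seed=3):
    """Time-stretch by overlap-adding Hann-windowed grains read slowly
    from the source; keeps timbre, smears transients (i.e. a pad)."""
    rng = np.random.default_rng(seed)
    win = np.hanning(grain)
    out = np.zeros(int(len(x) * stretch) + grain)
    norm = np.zeros_like(out)
    for i in range((len(out) - grain) // hop):
        # read position advances 'stretch' times slower than write position
        read = int(i * hop / stretch) + rng.integers(0, hop)  # jitter
        read = min(read, len(x) - grain)
        w = i * hop
        out[w:w + grain] += x[read:read + grain] * win
        norm[w:w + grain] += win
    return out / np.maximum(norm, 1e-9)   # normalise window overlap

fs = 48000
t = np.arange(fs // 4) / fs                       # 250 ms percussive hit
hit = np.sin(2 * np.pi * 330 * t) * np.exp(-20 * t)
pad = granular_stretch(hit, stretch=8.0)          # ~2 s sustained bed
```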
Working with directors, showrunners and producers in 2026
Anaïs says collaboration is more iterative than ever because of quick streaming schedules and the demand for immersive sound. Her practical advice:
- Deliver sonic references: Provide short reference mixes (30–60 s) that demonstrate scale; temp music cut to picture is not enough on its own.
- Make choices early: Pick an anchor instrument early (e.g., trombone motif) so post can build consistent spatial treatments.
- Render stems for different platforms: Provide stereo, 5.1 and Atmos-ready stems. It saves time in final delivery and avoids last-minute compromises.
On AI, sample libraries and ethical production
2025–26 saw explosive growth in AI tools that can synthesize or emulate instruments. Anaïs acknowledges their utility but warns against overreliance.
"AI is a force multiplier for ideation, not a substitute for expressive performance. Use it to sketch and expand ideas, but keep human nuance — especially with instruments like trombone — at the core."
Actionable guidance for ethical use:
- Always check sample library licenses and vocal/performer rights.
- Credit session players and obtain permission for any AI recreations using a performer’s signature sound.
- Use AI to generate motifs and textures, then humanize them through performance or detailed sample editing.
Case study: Turning a chamber trombone line into a cosmic motif (step-by-step)
This is a compact, repeatable recipe Anaïs uses when the trombone is the seed for a larger 'cosmic' bed.
- Record a dry trombone phrase with close and ambient mics, plus a contact mic on the bell.
- Duplicate the dry track: keep one pure and send others through processing chains (granular, spectral morph, chorus + long convolution).
- Pitch-shift a duplicate of the pure track down an octave, blended at low volume to add weight without obvious doubling.
- Layer a synth sub (sine or wavetable) locked to the trombone’s root to reinforce low end on small speakers.
- Apply a long, slightly modulated reverb on aux — choose an IR that suggests scale (hangar, ship bay), then automate wetness to bring in the vastness only when needed.
- Add motion with slow LFO panning and stereo delays synced to tempo to avoid smearing rhythmic detail.
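The "synth sub locked to the trombone's root" step can be sketched programmatically: estimate the root pitch from the recording, then add a low sine an octave below it. Here is a rough Python/NumPy illustration, with an autocorrelation pitch estimate and a square-ish tone standing in for a real trombone take (helper names are ours):

```python
import numpy as np

def estimate_f0(x, fs, fmin=40.0, fmax=500.0, n=4096):
    """Rough root-pitch estimate from the autocorrelation peak of the
    first n samples, searched between fmin and fmax."""
    seg = x[:n] - np.mean(x[:n])
    ac = np.correlate(seg, seg, mode="full")[n - 1:]
    lo, hi = int(fs / fmax), int(fs / fmin)
    lag = lo + int(np.argmax(ac[lo:hi]))
    return fs / lag

def add_sub(x, fs, level=0.3):
    """Layer a sine one octave below the detected root to reinforce
    the low end on small speakers."""
    f0 = estimate_f0(x, fs)
    t = np.arange(len(x)) / fs
    return x + level * np.sin(2 * np.pi * (f0 / 2) * t)

fs = 48000
t = np.arange(fs) / fs
trombone_ish = np.sign(np.sin(2 * np.pi * 116.54 * t)) * 0.3  # Bb2-ish stand-in
thick = add_sub(trombone_ish, fs)
```

In practice you would track pitch per note (or read it from the score) rather than assume a single sustained root, but the locked-sub idea is the same.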
Tools Anaïs recommends (2026 edition)
- Sample libraries: High‑resolution brass libraries with multiple mic positions (do not cheap out on mic perspective).
- Spectral processors: Real‑time FFT morphers and spectral blur plugins (2025–26 generations are less artifact-prone).
- Granular engines: For long stretched beds and evolving pads.
- Convolution reverb with user IRs: Capture your own spaces whenever possible.
- Spatial mixers: Tools that let you author mixes for Atmos and headphone binaural quickly.
Common pitfalls and how to avoid them
- Overprocessing performers: If you strip away too many expressivity cues, the result will feel synthetic. Keep a dry path for authenticity.
- Cluttered midrange: Clean low‑mid EQ and selective masking help maintain clarity between electronics and orchestra.
- Ignoring delivery formats: Always test how spatial effects translate when downmixed to stereo or mono.
Audience takeaways — practical checklist
Use this checklist whether you’re scoring, producing a podcast about music, or simply curious:
- Define the emotional function first (scale, threat, wonder).
- Choose one acoustic instrument as an anchor motif (trombone is great for depth).
- Record multiple mic positions and preserve dynamics.
- Layer spectral and granular processed copies, keeping at least one dry path.
- Use convolution reverb with unusual IRs for instant scale.
- Mix for multiple endpoints: earbuds, TV, Atmos — and check mono.
- Use AI ethically: ideate with it, perform with humans.
Future predictions: What to watch in 2026 and beyond
Based on her work and industry trends, Anaïs predicts:
- Hybrid live events will expand: More live-to-picture shows blending orchestra and immersive electronics.
- Atmos and spatial-first composition: Composers will increasingly think in 3D from day one, not as an afterthought.
- Ethical frameworks for AI: Contracts and credits for AI-used timbres will become standard in major productions.
Where to hear Anaïs’ work and learn more
Listen to the full interview episode (podcast/video) for audio demos: isolated trombone chains, before/after mixes and a walk-through of a recent scene. We’ve included time stamps in the episode notes for quick A/B comparisons.
Final thoughts
Creating cosmic soundscapes is equal parts orchestration, recording craft and modern signal design. The tools available in 2026 — from improved spectral processors to spatial mixing platforms — make it easier to imagine and realize vast sonic worlds. But the human core remains essential. As Anaïs puts it:
"The more alien your textures, the more you need a human center to anchor the audience emotionally. That center can be a single trombone line, a breath, a gesture — keep it honest."
Actionable next steps
- Download the episode notes and the 1‑page "Cosmic Scoring Checklist" linked in the episode description.
- Try the trombone-to-pad recipe above with a short 8‑bar phrase and A/B into a dry acoustic version.
- Experiment with a custom IR: record inside a large room or tunnel and use it on a string sustain.
Call to action
Want the raw stems and processing chains from this interview to try at home? Subscribe to thegalaxy.pro podcast feed, download the episode resources, and join our online forum for composers and fans where we hold monthly score deep-dives. If you create a piece using these techniques, share a short clip with #CosmicScore — we’ll feature standout work in an upcoming episode.