DreamDirector.AI is a dream-tech app for Apple Watch and iPhone, designed to be an accessible and effective dream-directing tool. The Beta MVP uses machine learning to detect REM sleep in real time. Planned AI integrations include generative AI to visualize dream reports and craft custom Dream Scripts. What experiences would you choose if you could direct your own dreams?
Your sleep cycle passes through four phases over a roughly 90-minute period that repeats throughout the night. The first three are non-REM (NREM) phases N1, N2, and N3, with N3 known as slow-wave sleep (SWS); the fourth is REM sleep, when most vivid dreaming occurs. REM sleep can be detected with appropriate sensors and DreamDirector.AI's proprietary REM sleep detection model.
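The repeating cycle described above can be sketched as a tiny model. This is purely illustrative: the per-phase durations below are rough placeholders, not the app's actual sleep-staging model, and real phase lengths shift across the night.

```python
CYCLE_MINUTES = 90
# Rough per-phase durations in minutes; placeholder values for illustration only.
PHASES = [("N1", 10), ("N2", 40), ("N3", 25), ("REM", 15)]

def phase_at(minutes_asleep):
    """Return the sleep phase at a given number of minutes after falling asleep."""
    t = minutes_asleep % CYCLE_MINUTES  # the cycle repeats throughout the night
    for name, duration in PHASES:
        if t < duration:
            return name
        t -= duration
    return PHASES[-1][0]
```

With these placeholder durations, minute 85 of any cycle falls in REM, and minute 90 starts a new cycle back at N1.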
One way to influence a dream is to play back external cues (a light flash, sound clip, vibration, etc.) during the dream. These cues can be incorporated into the dream and bias it in the desired direction. This enables dream directing: the purposeful shaping of dream content using external cues that act as cognitive support for the dreamer.
External cues can also remind the user that they are dreaming, a technique known as lucid dream induction. A lucid dream is a dream in which the dreamer knows they are dreaming and can influence the dream, e.g., fly around or pass through walls.
Dream scripts are customizable programs within the DreamDirector.AI app that play back during REM sleep to bias the dream in the desired direction. Users can select pre-made dream scripts or create their own in the editor. We are also developing an AI tool for generating novel dream scripts from user prompts.
Dream scripts consist of stimuli (light flashes, sounds, vibrations, etc.) that cue the individual within their dream. For example, a user could choose (or create) a dream script for a tropical vacation, with the intention of decompressing and truly relaxing during the dream.
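A dream script like the tropical-vacation example can be pictured as a timed sequence of cues. The structure below is a hypothetical sketch, not the app's actual script format; the cue kinds, field names, and sample payloads are all invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Cue:
    kind: str        # "light", "sound", or "vibration"
    at_seconds: int  # offset from the start of playback
    payload: str     # e.g., a sound clip name or a light-flash pattern

# A hypothetical "tropical vacation" dream script expressed as timed cues.
tropical_vacation = [
    Cue("sound", 0, "ocean_waves.wav"),
    Cue("light", 30, "slow_warm_pulse"),
    Cue("sound", 60, "seagulls.wav"),
]
```

Keeping cues ordered by their playback offset makes a script easy to preview before sleep and easy to step through during REM.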
Before going to sleep, users are encouraged to preview the dream script. Once they are asleep and in the desired REM window (usually the last REM period of the night, just before they wake up), DreamDirector.AI initiates dream script playback.
Dream scripts not only let users influence their dream content more consciously; they also create a new opportunity for group dreaming, where a group of people all use the same dream script. Would you try shared dreaming?
Unlike most sleep apps, which only give a morning snapshot of the previous night's sleep, DreamDirector.AI detects REM sleep in real time using a machine learning model. When REM sleep is detected within the desired wake-up window (this is a smart alarm, after all), the dream script cues start to play back. These cues are incorporated into the dream and begin to bias it in the desired direction, acting as cognitive support that reminds the dreamer of their dream intention while also directly influencing dream content.
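The trigger condition described above combines two signals: the ML model's real-time REM detection and the user-defined wake-up window. A minimal sketch of that decision, assuming `rem_detected` stands in for the classifier's output:

```python
from datetime import datetime, time

def should_start_playback(rem_detected, now, window_start, window_end):
    """Trigger cue playback only when REM sleep coincides with the wake-up window."""
    in_window = window_start <= now.time() <= window_end
    return rem_detected and in_window
```

For example, with a 6:00–7:00 window, REM detected at 6:30 starts playback, while REM at 5:00 does not. (A real implementation would also need to handle windows that cross midnight.)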
DreamDirector.AI uses voice-to-text for easy dream capture. We are developing generative AI solutions to visualize the user's dream as images and short-form video, enhancing dream recall while also making dream sharing more fun.
The Alpha App did not yet have REM sleep targeting. The Beta MVP app does have REM targeting using AI.
An editor is used to write the dream script, including audio prompts.
The user sets the wake-up window and previews the Dream Script before going to sleep.
The app waits until the user-defined wake-up window and then monitors for REM sleep.
During REM sleep, the cues play back, get incorporated into the dream, and bias the dream in the desired direction.
The smart alarm wakes the user and a voice-to-text dream report is captured. The dream report can later be visualized using generative AI.
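The steps above can be sketched as one orchestration function. Every function passed in here is a hypothetical stand-in for the app's real components (the REM classifier, the cue player, the smart alarm, and the voice-to-text recorder); this is a flow sketch, not the production implementation.

```python
def run_night(in_wake_window, detect_rem, play_script, wake_user, record_report):
    """Orchestrate one night: wait for the window, watch for REM, cue, wake, record."""
    # 1. Wait until the user-defined wake-up window opens.
    while not in_wake_window():
        pass
    # 2. Watch for REM sleep inside the window.
    while not detect_rem():
        pass
    # 3. Play back the dream script cues during REM.
    play_script()
    # 4. The smart alarm wakes the user; capture a voice-to-text dream report.
    wake_user()
    return record_report()
```

The returned dream report is what would later be handed to the generative AI visualization step.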