MA Lighting release a Quantum Shift in Lighting Control

soundlight

MA Lighting today announced that after six years of research, development and prototyping, their new user thought controlled lighting interface is ready for release and demonstrations. The new product has the working name ‘Cranial Replay for Artists and Performance’.


MA Lighting’s R&D department worked closely over a period of four years with Professor Helmut Krown of the Massachusetts Institute of Technology (MIT) department of Cranial Uniformity and Neurological Technology. Professor Krown has spent over ten years developing the concept with NOAA, NASA and the EPA.
Professor Krown said “We were tasked by MA Lighting to conceive and bring to reality a lighting control surface that would fully integrate, for the first time, the lighting professional and the technology as one. The use of thought control to activate cues, create mood, display feeling via the lights and optimize the control of video elements via the brain is a quantum shift in the way visual entertainment technology is controlled and considered.”
The lighting operator dons what looks like a normal MA Lighting cap, which is hard-wired to a belt pack that communicates via Wi-Fi with the modified MA2 console. This wireless feature means that control can be done remotely from the console.
A water-based lubricant is required on the scalp of the operator to ensure good electrical contact between the operator and the cranial unit.
The integrated skull cap and belt pack features a revolutionary proprietary artificial intelligence (AI) interface that will ‘learn’ the biorhythms of the individual operator’s brain electrical activity. In setup mode, the operator is required to wear the unit for 12-14 hours so the unit can read, learn and record the user’s brain waves. This is a once-only setup, and multiple users can be preprogrammed into the solid-state technology contained within.
Once setup is complete, the interface is operable by thought control over the lighting and video elements. Lamp focus, colour, gobo and beam information can be selected by a gentle thought process. Whole cue lists, macros and multiple other show features can be designed, edited and played back by the operator’s thought.
“I am very much looking forward to showing this to our customers,” commented Mike Gearin of Show Technology, MA Lighting’s Australian distributor. “It’s taking virtual reality to reality, placing your mood and thoughts onto a picture on stage – it really is a great step in the development of lighting control.”
The system will be shown to a select few at Prolight+Sound in Frankfurt next week where it is sure to be the most talked about shift in entertainment technology since the VL1.
For more information please go to MA Lighting

Thanks to the Aussies at ALIA for this one: http://www.alia.com.au/category/equipment-news/
 
Not bad for an early April Fool.
As ‘out there’ as this may seem, research along these lines began serious testing in a totally bleeding-edge facility designed and constructed on the roof of a psychology building at Hamilton, Ontario’s McMaster University. It opened only this past fall with a substantial influx of monetary support from Meyer Sound.
Yes. Research is actually being conducted on observers and performers alike, with all of the participants wearing totally custom helmets summarily amounting to ‘tin foil hats’ and linked directly into data systems via dedicated high-speed fiber links. Again, I’m serious about this; several AES meetings have featured private tours of this facility. To date, all reports leaking out are extremely positive, but then why wouldn’t they be, as the opening ‘buzz’ of having a new toy to pleasure themselves and further their research has yet to wear off.
No, this is NOT another April fools’ day prank.
Toodleoo!
Ron Hebbard.
 
That’s going to be an awesome technology.
The following contains several links to what's happening with the research along with some of their methods and thoughts on where they're going.
It is called the "McMaster Institute for Music and the Mind" (MIMM). Here is a link to their home page: https://mimm.mcmaster.ca/
The 100+ seat concert hall with the extremely low background noise rating of NC 10 and the Meyer Constellation variable acoustic system is called LIVELab: http://livelab.mcmaster.ca/
This link describes the technology and will give you some idea of what they are doing: http://livelab.mcmaster.ca/research/technology/
@Moose Hatrack and @BillConnerFASTC, you MAY find this of interest if you've the time and your sound on.
Toodleoo!
Ron Hebbard.
 
Dude...
Imagine just passing the head thing to the director when they're having a hard time explaining what they want.
 
Dude...
Imagine just passing the head thing to the director when they're having a hard time explaining what they want.
In the case of MIMM (McMaster Institute for Music and the Mind)'s work, they're actually simultaneously monitoring and recording the brain waves not only of the performers but of the audience members as well. They're combining motion-capture technology, then processing and graphically projecting the data as they're capturing it, in real time and/or seconds later. Data collection is via fiber, and the amount of processing power they're using simultaneously is phenomenal. A long-time friend and former associate has invited the Toronto AES chapter into the facility on at least one occasion. The link I pointed out earlier in this thread is pretty much the briefest overview of the work without requiring a lot of your time or going too deeply into any of MIMM's efforts.
Toodleoo!
Ron Hebbard.
BTW "Dude", I'm Ron to one and all.
 
