Hey everyone! Maya rendering is partially ready! As with the video transcoding service, this will be a free rendering platform until there has been enough testing to ensure it works in the majority of cases. Three input file types will be supported. The initial capability is that a single frame can be rendered from an "Arnold Scene Source" (.ass) file, which can be exported directly from Maya. After that, a zip file containing more than one .ass file will be accepted. This will be the first type of Maya/Arnold render to make use of the distributed nature of Hyperwave, with each capable node taking on a frame at a time until the job is complete. Finally, a .mb file, or what I consider a typical Maya file, can be sent with an optional parameter indicating which of the frames already defined in the sequence need to be rendered.
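To make the distributed case concrete, here is a minimal sketch of the "each node takes one frame at a time" model for a zip of .ass files. All names here (`render_frame`, `NODE_COUNT`, the file naming) are illustrative assumptions, not the real Hyperwave API:

```python
import queue
import threading

# Illustrative sketch: frames from an uploaded zip go into a shared
# queue, and each "node" (a thread here) pulls one frame at a time
# until the queue is drained.
NODE_COUNT = 4
frames = [f"shot.{i:04d}.ass" for i in range(1, 25)]  # 24 frames in the zip

work = queue.Queue()
for f in frames:
    work.put(f)

results = []
lock = threading.Lock()

def render_frame(ass_file):
    # Placeholder for the actual Arnold render invocation on a node.
    return f"rendered {ass_file}"

def node_worker():
    while True:
        try:
            f = work.get_nowait()
        except queue.Empty:
            return  # no frames left; this node is done
        out = render_frame(f)
        with lock:
            results.append(out)

threads = [threading.Thread(target=node_worker) for _ in range(NODE_COUNT)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(len(results))  # every frame rendered exactly once
```

The point of the queue (rather than pre-assigning frame ranges) is that faster nodes naturally pick up more frames, so a slow machine can't stall the whole job.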
I’ll also start working on a Hypernet page so that there is visibility into the scale of the pool behind each type of request. Initially, the pool is just my own personal resources. I’ll also be releasing client software that grants credits per work unit processed. In the longer term, those credits will be usable to fund pattern executions, with an additional payment method to top up or cover any gap in the credits needed to process a job.
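The credit model above can be sketched in a few lines. The function name, the pricing, and the idea of a per-credit dollar rate are all assumptions for illustration; the real accounting may work differently:

```python
# Hypothetical credit accounting: spend earned credits first, then
# charge a payment method for whatever gap remains.
def payment_due(credits_balance, job_cost_credits, credit_price_usd=0.01):
    """Return (credits_spent, usd_to_top_up) for a single job."""
    spent = min(credits_balance, job_cost_credits)
    gap = job_cost_credits - spent
    return spent, gap * credit_price_usd

# A user with 100 credits submitting a 250-credit job spends all 100
# and pays for the remaining 150 credits.
print(payment_due(100, 250))
```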
To test this process, I’ve used an animation that I built while attending a Master’s program at Academy of Art University. Here is the full video: https://www.youtube.com/watch?v=ZnqYVGfYDco&ab_channel=AndrewJonesGaming . Additionally, the specific frame I keep rendering for testing uses an nCloth simulation on a spiderweb, so it’s a very resource-intensive render, taking 25 minutes for a single 1920 x 1080 frame. At 24 fps, a single second of video would then take 600 minutes to render. Hyperwave is ready to split that up and attack it from a distributed computing angle!
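The arithmetic above, plus a rough estimate of what distribution buys you, can be worked out directly. The node counts are illustrative assumptions; this ignores upload, scene-load, and scheduling overhead:

```python
import math

# Numbers from the post: 25 minutes per 1920x1080 frame, 24 fps.
MINUTES_PER_FRAME = 25
FPS = 24

serial_minutes = MINUTES_PER_FRAME * FPS
print(serial_minutes)  # 600 minutes for one second of video on one machine

# With N nodes each taking one frame at a time, wall-clock time for one
# second of footage is roughly ceil(frames / nodes) * minutes_per_frame.
for nodes in (4, 8, 24):
    wall = math.ceil(FPS / nodes) * MINUTES_PER_FRAME
    print(f"{nodes} nodes: ~{wall} minutes")
```

At 24 nodes the whole second comes down to a single frame's render time, which is the best case the one-frame-per-node model allows.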