Monday, April 27, 2020

Fwd: OctaneRender 2020 and RNDR are out now!



---------- Forwarded message ---------
From: OTOY Inc. <help@otoy.com>



OctaneRender 2020 and RNDR are out now!


OTOY® is very proud to launch OctaneRender 2020 - with huge 2-3x speed gains using OptiX 7 RTX hardware acceleration, Native GPU C4D Noises and much more.
 

In tandem, OTOY is launching the RNDR Network™ today for all Octane 2020 users, ushering in a new era of decentralized production-ready GPU cloud rendering at scales never before possible.
 

What's New in OctaneRender 2020.1:
  • NVIDIA RTX GPU Hardware Acceleration with 2-3x speed increases, improved NVLink support, and order-of-magnitude (15-30x) accelerations in fully RTX-optimized scenes
  • Fast Spectral Random Walk SSS Skin and Spectral Universal Hair Material for instant, unbiased, GPU-based sub-surface scattering
  • Native C4D GPU Noise Shaders for volumes, textures, displacement and shader graphs authored in Maxon Cinema 4D
  • Universal Camera with OSL Distortion Maps, Split Focus Diopter, Optical Vignetting, Aperture Texture and in-camera controls for advanced DOF, aberration and distortion
  • GPU Volume Displacement and Vectron Volumes, bringing fully procedural volumes to the Octane node graph along with new Vectron Mesh and Volume operators
  • Octane Spectron Light Primitives with 4x faster Quad and Point Spectron Lights, procedural spread functions, and new gobo filters and lights

Other New Features

  • LiveDB OSL Procedural Shaders, Native Curve and Point Primitive Attributes, Improved Rounded Edges, ACES and Advanced Color Management, Universal Dirt System, Volume Shadow Step Length controls, an improved Daylight System Sky Model, and a Hydra Render Delegate supporting the Houdini 18 Solaris viewport.
 

Concurrently with the OctaneRender 2020.1 launch, the RNDR Network is now available publicly to all users with an active Octane subscription or maintenance plan. The public launch of RNDR introduces artist-friendly RNDR credit purchasing options as well as a new Enterprise tier - in partnership with Microsoft and other GPU cloud providers - enabling secure MESA/TPN/MPA production rendering on the RNDR network.
 

Subscribe or upgrade to OctaneRender 2020 today and access RNDR at rndr.otoy.com. A preview of OctaneRender 2020.2 and the complete OctaneRender 2020 roadmap can be viewed here. Please read the release post for more details on OctaneRender 2020.1 features.
The OTOY Team

www.otoy.com


"Captain Marvel" Main Title Sequence by Elastic. © 2020 Marvel. All rights reserved.

Copyright © 2020 OTOY Inc., All rights reserved.


Saturday, April 25, 2020

Fwd: [Newsletter] The final week of Bitmovin LIVE is upon us - learn about Live Low Latency Streaming and Data-Driven Workflows


---------- Forwarded message ---------
From: The Bitmovin Team <marketing@bitmovin.com>


Virtual NAB - Data Driven Workflows Webinar

The age of data and information may have reached peak attention in the early 2010s, but that doesn't mean it has lost any steam in the past 10 years. OTT, VoD, and broadcast services must rely on data to optimize their business for risk mitigation and ROI. Join us on Monday, Apr 27th @ 8am PT (11am ET | 3pm UTC), as we outline the types of workflows that depend on data to succeed and how you can use data to improve your own workflows. Register below:

Sign up today


Blogs

Video Tech Deep Dive: Live Low Latency Pt 1

According to our Video Developer Report 2019, low latency streaming was the top challenge developers face when building out their apps or services. This challenge multiplies when considering low latency for live streaming. Take a moment to learn about Live Low Latency, how it differs from "standard" low latency, and why it's important for the future of your business in our latest blog post:
View blog post
 

Bitmovin...A Great Place to Work!

Writer's message: As employees, we know that Bitmovin is a great place to work, especially given recent circumstances. All of our managers, directors, and executives have stressed the importance of staying safe, healthy, and maintaining a balanced work-life schedule in these trying times - so it was a great vindication to hear that Bitmovin earned the #2 spot in this year's "Great Places to Work" ranking for all of Austria. Check out our blog post to learn a little more about what this means and how you can join our international team!

 

View blog post


bitmovin live: NAB edition

Event schedule: Week 3 (Apr 27th - 30th)


  • Apr 27th - Tech Talk: Follow the Data: Data-Driven Workflows
  • Apr 27th - Tech Talk: Achieving D2C Streaming Success
  • Apr 28th - Tech Talk: Low Latency Streaming
  • Apr 28th - Tech Talk: Encoding in AWS Cloud Environments ft. Amazon Web Services
  • Apr 29th - Partner Presentation: Beenius: OTT as a Strategy
  • Apr 29th - Learning Lab: Advanced Player Course
  • Apr 30th - Tech Talk: State of Compression - The Video Codec Landscape 2020

 

Job Openings

Open roles

Today, we're seeking curious, inventive, and flexible individuals to support our US and Austrian teams. Check out the links below for more information:

  • Solutions Architect (Denver) - A Solutions Architect's responsibility is to foster our customers' technical success and overall happiness. Join our team if you have experience in software engineering/web development and interest in online video.
  • Engineering Team Lead (Klagenfurt) - As an engineering team lead, your responsibility will be to run our Web Player team. The Player team works on our industry-leading adaptive streaming player that runs everywhere, from desktop browsers to mobile apps to OTT devices such as smart TVs and gaming consoles.

San Francisco  •  Chicago  •  Denver  •  Seattle  •  New York  •  Klagenfurt  •  Vienna  •  London  •  Hong Kong  •  Sao Paulo

© Bitmovin, Inc. - 41 Drumm Street, San Francisco, CA, 94111, USA





Sunday, April 12, 2020

Multistream to 30+ Platforms Simultaneously | Restream

Next generation solar cells perform better when there's a camera around

Quantum camera snaps objects it cannot 'see'


A normal digital camera can take snaps of objects not directly visible to its lens, US researchers have shown. The “ghost imaging” technique could help satellites take snapshots through clouds or smoke.
Physicists have known for more than a decade that ghost imaging is possible. But, until now, experiments had only imaged the holes in stencil-like masks, which limited its potential applications.
Now Yanhua Shih of the University of Maryland, Baltimore, and colleagues at the US Army Research Laboratory, also in Maryland, have taken the first ghost images of an opaque object – a toy soldier.

Quantum camera

Ghost imaging works a bit like taking a flash-lit photo of an object with a normal camera. In that case the image forms from photons that leave the flash, bounce off the object and travel into the lens.
The new technique also uses a light source to illuminate an object. However, the image is not formed from light that hits the object and bounces back. Instead, the camera collects photons that do not hit the object, but are paired through a quantum effect with others that did.
In Shih’s experiments a toy soldier was placed 45 centimetres from a light source whose beam was split in two. One beam was pointed at the toy and the other at a digital camera. A photon detector placed near the soldier could record only that a photon had bounced off the toy, not where it had struck.

Connected pairs

Photons from the light source constantly travel down both paths created by the splitter: either towards the soldier and the photon detector, or towards the camera. The detector and camera each record a constant stream of photons, and occasionally both register a photon at exactly the same instant.
When this happens, there is a direct relationship between where one of the photons hit the soldier, and where the other one hits the camera’s sensor, says Shih, because of a quantum effect called “two-photon interference”.
“If the first photon stops at one point on the object plane, the second photon can only be observed at the corresponding point on the image plane,” he says.
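The ghost-imaging literature often writes this point-to-point correspondence as a coincidence-counting rate; the formula below is standard background and is not taken from the article. Writing R(x_o) for the object's reflectivity at point x_o and G^{(2)} for the second-order correlation between the object plane and the camera plane,

    R_c(x_i) \propto \int R(x_o) \, G^{(2)}(x_o, x_i) \, dx_o , \qquad G^{(2)}(x_o, x_i) \propto \delta(x_o - x_i)

for an ideal position-correlated pair source, so the coincidence rate at camera pixel x_i simply traces out R(x_i) – the object itself.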
So when the camera keeps only the pixels struck by photons that arrive at exactly the same moment as a detection at the soldier, a “ghost image” of the object gradually builds up. The soldier’s image appeared after around 1000 coincident photons had been recorded.
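To make the coincidence-counting step concrete, here is a minimal, purely illustrative Python simulation of that reconstruction. The object mask, number of photon pairs and detector efficiency are invented for the example (nothing here comes from Shih's actual apparatus); the point is only that keeping a camera pixel solely when a coincidence occurs recovers the object's shape.

import numpy as np

# Hypothetical stand-in for the toy soldier: a binary mask where 1 marks a
# reflective point on the object and 0 marks empty space.
N = 64
obj = np.zeros((N, N))
obj[16:48, 24:40] = 1.0   # "torso"
obj[8:16, 28:36] = 1.0    # "head"

rng = np.random.default_rng(0)
ghost_image = np.zeros((N, N))   # accumulated coincidence counts per camera pixel
n_pairs = 200_000                # photon pairs emitted by the source (assumed)
detector_efficiency = 0.8        # bucket-detector efficiency (assumed)

coincidences = 0
for _ in range(n_pairs):
    # Position-correlated pair: the signal photon that illuminates the object
    # and the idler photon that reaches the camera share the same transverse
    # coordinate (the idealised point-to-point correlation described above).
    x, y = rng.integers(0, N, size=2)

    # The bucket detector near the soldier clicks only if the signal photon
    # actually bounced off the object at (x, y).
    if obj[x, y] > 0 and rng.random() < detector_efficiency:
        # Coincidence: keep the idler photon's pixel on the camera sensor.
        ghost_image[x, y] += 1
        coincidences += 1

print(f"coincidences recorded: {coincidences}")
# ghost_image now resembles obj; in the real experiment roughly 1000
# coincidences were enough for the soldier's outline to emerge.

Note that the camera on its own records photons at every pixel; it is only the coincidence gating with the bucket detector that carves the object's silhouette out of that otherwise uniform background.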

Sunny snaps

“It is clear that the experimental set-up can be directly applied to sensing applications,” Shih told New Scientist.
The same method could one day be employed to produce satellite images of objects hidden behind clouds or smoke, using the sun’s radiation as the photon source, says Shih. Doing that may require a photon counter beneath the clouds, but could allow a top-down view not possible using conventional methods.
Not everyone agrees that quantum effects are at work in ghost imaging, though. Baris Erkmen and Jeffrey Shapiro of the Massachusetts Institute of Technology in Cambridge, US, point out in a recent paper that, according to classical physics, light sources produce streams of photons that need not be correlated in the way Shih suggests.
They suspect ghost images might be produced without a quantum link between photon pairs, purely because some photons are just similar.


https://www.newscientist.com/article/dn13825-quantum-camera-snaps-objects-it-cannot-see/