Monday, August 03, 2020

Digital humans that look just like us | Doug Roble

Sunday, August 02, 2020

One negative review of XSplit.

Frustrated doesn't even come close to what I'm feeling. I'm using OBS and XSplit to make my videos, and XSplit is giving me fits. I'm pretty done with it! The audio and video continue to be out of sync even though I've micro-adjusted the settings, gotten a new hard drive and high-speed internet, and now use an ethernet cord. WHO KNOWS WHAT OTHER SOFTWARE I CAN USE BESIDES XSPLIT??? I have my debit card ready, lol.
*Note: unless you know what I'm talking about, it's ok, please don't worry about it. 
**And NO, I'm not messing with XSplit anymore right now. 
😬😖😩


Friday, July 31, 2020

Projection Tool - poor man's laser cutter.

PolyProjector

This is a tool for projecting the individual shapes of a 3D model onto sheets of cardboard to create physical models. It's kind of like a poor man's laser cutter. It's built on top of the Unity game engine.




Source code:  https://github.com/greengiant83/PolyProjector

You can download a precompiled version of the app here: Windows 10: https://github.com/greengiant83/PolyProjector/raw/master/App%20Builds/PolyProjector%20-%20Windows.zip


After calibrating the screen, press L on your keyboard to load the 3D model. Use < and > (comma and period) to cycle through the faces of your model. The mouse wheel can be used to rotate the polygon for easier working. Some more information can be found in this video:
https://www.youtube.com/watch?v=iDshWkUkbXk



The cool thing here is that it was made by Matt Bell, a virtual reality expert who wanted to make physical objects.
Sort of the reverse of my path: I have the digital manufacturing down, but now I want to put things into VR.




ImmersionVR has a great article about VR video and VR360

https://immersionvr.co.uk/about-360vr/vr-video/

What is VR video?

VR video and 360 VR are essentially interchangeable terms. They refer to videos that are captured using specialist omnidirectional cameras which enable the filming of an entire 360 degrees at the same time.
In the finished video the user is free to look around the entire scene. In contrast to regular videos, VR videos provide an immersive, interactive experience. The user often experiences the feeling of actually “being there”.
Contrary to popular belief, you do not need a special device in order to view 360 videos. They can be viewed on the vast majority of devices, including mobile.
The user can swipe or scroll across the screen in order to view the entire 360 degrees of the scene. This can be particularly useful for commercial applications, such as virtual reality real estate tours.
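The swipe-to-look interaction described above boils down to a bit of spherical math: drag deltas update a longitude/latitude pair, which is then converted into the direction the camera looks. A minimal sketch in plain JavaScript (the function names and sensitivity value are my own illustration, not from any particular player):

```javascript
// Convert accumulated drag deltas into a view direction for a 360 video player.
// lon/lat are in degrees: lon is the horizontal look angle, lat the vertical.
function dragToLonLat(lon, lat, dx, dy, sensitivity = 0.1) {
  const newLon = lon - dx * sensitivity; // dragging right pans the view left, as in most players
  const newLat = Math.max(-85, Math.min(85, lat + dy * sensitivity)); // clamp to avoid flipping over the poles
  return { lon: newLon, lat: newLat };
}

// Turn lon/lat into a unit direction vector: the point on the video sphere
// the camera should look at.
function lonLatToDirection(lon, lat) {
  const phi = ((90 - lat) * Math.PI) / 180; // polar angle from straight up
  const theta = (lon * Math.PI) / 180;      // azimuth around the viewer
  return {
    x: Math.sin(phi) * Math.cos(theta),
    y: Math.cos(phi),
    z: Math.sin(phi) * Math.sin(theta),
  };
}
```

In a real player the video is mapped onto the inside of a sphere with the camera at its center; each mousemove or touchmove feeds its deltas through `dragToLonLat`, and the camera looks along `lonLatToDirection`.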



DepthKit - Depth Image Video - Free-Viewpoint Video - Volumetric Video



https://www.depthkit.tv/
DepthKit for A-Frame

An A-Frame component for rendering volumetric videos captured using DepthKit (i.e. Kinect + DSLR) in WebVR. The component wraps DepthKit.js, which provides a similar interface for Three.js projects.


https://github.com/juniorxsound/DepthKit-A-Frame

https://orfleisher.com/aframe

https://github.com/juniorxsound/DepthKit-for-Max


DepthKit for Max/Msp/Jitter

A sample Max patch demonstrating a workflow for playing volumetric video in Max/Msp/Jitter using DepthKit combined-per-pixel exports. Supports rendering a mesh, wireframe and points.
DepthKit in Max


Thursday, July 30, 2020

Seurat : system for image-based scene simplification for VR

https://developers.google.com/vr/discover/seurat



Seurat

Seurat is a scene simplification technology designed to process very complex 3D scenes into a representation that renders efficiently on mobile 6DoF VR systems.
Seurat works by taking advantage of the fact that VR scenes are typically viewed from within a limited viewing region (the box on the left below), and leverages this to optimize the geometry and textures in your scene.
It takes RGBD images (color and depth) as input and generates a textured mesh, targeting a configurable number of triangles, texture size, and fill rate, to simplify scenes beyond what traditional methods can achieve.
Seurat is available as an open source project on GitHub, and includes plugin support for generating depth images for scenes in both Unity and Unreal.

Seurat - Documentation

What is Seurat?

Seurat is a system for image-based scene simplification for VR. It converts complex 3D scenes with millions of triangles, including complex lighting and shading effects, into just tens of thousands of triangles that can be rendered very efficiently on 6DoF devices with little loss in visual quality. It delivers high-fidelity graphics on mobile VR devices. (One way to think of it: it serves the same role on 6DoF devices that stereo panoramas serve on 3DoF devices.)
The processing pipeline for static environments generates data for a single headbox (e.g. 1 m³ of space). Input data can be generated with any rendering system, e.g. a real-time game engine or an offline ray tracer. We have plugins for Unity, Unreal and Maya. Seurat outputs a mesh with an RGBA texture atlas, which can be rendered in any real-time engine. Dynamic content can be composited on top of the static Seurat environments.
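Seurat's input is a set of RGBD renders taken from sample viewpoints inside that headbox. As a rough sketch of what that first step involves (the grid-sampling scheme here is my own illustration, not Seurat's actual algorithm), here is how you might generate camera positions inside a 1 m³ headbox:

```javascript
// Generate sample camera positions inside a cubic headbox.
// Seurat's engine plugins render an RGBD image from each such position;
// this uniform grid is an illustrative stand-in for whatever sampling
// scheme the real plugins use.
function headboxSamples(center, size = 1.0, perAxis = 2) {
  const half = size / 2;
  // Map grid index 0..perAxis-1 onto [-half, +half] along one axis.
  const offset = (n) => (perAxis === 1 ? 0 : (n / (perAxis - 1)) * size - half);
  const positions = [];
  for (let i = 0; i < perAxis; i++)
    for (let j = 0; j < perAxis; j++)
      for (let k = 0; k < perAxis; k++)
        positions.push({
          x: center.x + offset(i),
          y: center.y + offset(j),
          z: center.z + offset(k),
        });
  return positions;
}
```

For example, `headboxSamples({x: 0, y: 1.6, z: 0})` gives the eight corners of a 1 m box around a standing viewer's head; each corner would then be rendered out as a color + depth image and fed to the Seurat pipeline.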



Optimizing for 6DOF mobile VR with Google's Seurat









I lifted this from a 3D photo off Facebook. Still trying to make sense of it...
You can see the image has been broken into subimages. I suspect these are mapped onto various planes at different Z-buffer levels, something that renders quickly in WebGL.
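If that guess about the layered planes is right, the renderer only needs basic frustum math: a plane pushed back to depth z has to be scaled up so it still fills the same field of view. A quick sketch of that relationship (my own illustration of the technique, not Facebook's actual code):

```javascript
// For a perspective camera, the world-space height of the view frustum at
// distance z is 2 * z * tan(fov / 2). A subimage placed on a plane at depth z
// must be scaled to this size to keep covering the full frame.
function planeSizeAtDepth(z, fovDegrees, aspect) {
  const height = 2 * z * Math.tan(((fovDegrees * Math.PI) / 180) / 2);
  return { width: height * aspect, height };
}

// Example: three layers at increasing depth, each sized to fill a
// 60-degree, 16:9 view. Nearer layers occlude farther ones via the Z buffer.
const layers = [1, 2, 4].map((z) => ({ z, ...planeSizeAtDepth(z, 60, 16 / 9) }));
```

Drawing a handful of textured quads like this is about the cheapest thing WebGL can do, which would explain why these 3D photos render smoothly even on phones.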



VR180 Test on Mobfish...




https://vr180test.mobfish.studio/webvr


Testing the Mobfish free trial of VRStudio.

https://mobfish.net/features/cloud-transcoding/






https://johnsokol.mobfish.studio/webvr

OMG, there are no more WebVR browsers; WebVR was discontinued about two years ago, and everything is now WebXR.

Well, it plays fine on a desktop.


Wednesday, July 29, 2020

How The 'Anti-Paparazzi' Scarf That Ruins Photos Really Works



It's retroreflective fabric, the kind sold for chromakey backdrops, with a silk-screen print on it.


Video Analytics.

A JavaScript SDK for tracking events and revenue to Amplitude.

https://github.com/amplitude/Amplitude-JavaScript

https://developers.amplitude.com/docs

How Amplitude Works

To understand how Amplitude works, let’s walk through a hypothetical example.
Tunes is a standard music player for mobile devices that has common actions like playing a song, skipping a song, shuffling play, and sharing a song.
Using Amplitude, you can track all the actions your users make in detail and better understand what’s working and what’s not.

What actions will Amplitude keep track of?

Amplitude gives you the power to determine what’s important to your experience. You can choose to track anything and everything.
For example, in Tunes, you could track the music control buttons the user presses, or even how many songs each user has listened to in each session.
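With the JavaScript SDK linked above, tracking those Tunes actions comes down to an init plus logEvent calls. A minimal sketch: the event name and properties are made up for the hypothetical Tunes example, while `getInstance().init()` and `logEvent()` are the Amplitude-JavaScript calls; the SDK is browser-only, so the snippet guards on `amplitude` being defined.

```javascript
// Hypothetical Tunes event: the name and properties are invented for
// illustration; Amplitude lets you attach arbitrary properties per event.
const playEvent = {
  name: 'Play Song',
  properties: { songId: 'abc123', genre: 'jazz', shuffle: false },
};

if (typeof amplitude !== 'undefined') {
  // Amplitude-JavaScript SDK calls (run in the browser with the SDK loaded):
  amplitude.getInstance().init('YOUR_API_KEY');
  amplitude.getInstance().logEvent(playEvent.name, playEvent.properties);
}
```

In the Amplitude dashboard you would then chart "Play Song" counts, segment by the `genre` property, or compare users who shuffle against those who don't.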



Tuesday, July 28, 2020

AR (Augmented Reality) PCB inspector.

https://www.instagram.com/p/CCbGRfqDJBS/


inspectarteam

inspectAR uses the board files from @adskeagle and @kicadsolutions, or the ipc2581b files from @altiumdesign, @mentorpcb and many other EDA tools, to create the AR overlays.
Save yourself hours of time flipping through datasheets, pinout diagrams and everything else that reduces your productivity. Eliminate the frustration of recounting pins for test points, or searching for specific traces.

#electronics #electrical #electricalengineering #engineering #engineers #engineer #engineeringlife #smd #tht #cad #hardware #firmware #pcb #pcba #pcbassembly #pcbdesign #pcbdesignengineer #debug #components #tech #technology #augmentedreality