How about a camera on an arm, so you can place the camera over the face of the person you’re talking to? It would block part of their face, but at least your conversation would be more transparent.
Of possible interest, Errol Morris’ “Interrotron” camera setup:
https://www.fastcodesign.com/1663105/errol-morriss-secret-weapon-for-unsettling-interviews-the-interrotron
I have a USB camera for my desktop, so yes, I can already do this, but blocking the face of the person you are talking to is a high price to pay for being seen as making eye contact.
Otherwise known as a Teleprompter. Or at least it’s the exact same setup, only used to display the interviewer rather than the script text.
Anyway, placing a camera behind a screen is possible, but it’s difficult to do in a way that’s completely unnoticeable. There is no such thing as a true one-way mirror: if a camera can see you, then you can see the camera. No matter how small the camera lens is, the glint off it may be noticeable behind the screen in some situations.
[quote=“scr4, post:24, topic:811953”]
Otherwise known as a Teleprompter. Or at least it’s the exact same setup, only used to display the interviewer rather than the script text.
[/quote]
The Interrotron is not just a Teleprompter; it incorporates a device like a Teleprompter. The critical thing is that there are two of them, so each participant makes virtual eye contact with the other.
The point is not to hide the camera, but to position it so the person being recorded is making eye contact with the camera.
If you really want to work on it, you could get a larger monitor and a camera with a telephoto lens and sit further from the screen; the difference in eye angle would be smaller.
If the larger monitor is there to keep the image the same size in your field of view, then you’re just giving back the gains you made by putting it further away. The only reason putting the screen further away works is that it decreases the angular size of the screen, and with it the angular gap between the on-screen eyes and the camera.
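To put rough numbers on that: the perceived gaze error is just the angle that the camera-to-on-screen-eyes offset subtends at your viewing distance. Here’s a quick Python sketch; the 20 cm offset and the viewing distances are made-up illustrative values, not measurements of any real setup:

```python
import math

def gaze_offset_degrees(camera_offset_cm, viewing_distance_cm):
    """Angle between the spot you look at (the on-screen eyes) and a
    camera mounted camera_offset_cm away, seen from viewing_distance_cm."""
    return math.degrees(math.atan2(camera_offset_cm, viewing_distance_cm))

# Camera ~20 cm above the on-screen face; viewer at 60 cm vs. 180 cm.
print(round(gaze_offset_degrees(20, 60), 1))   # 18.4 -- reads as looking away
print(round(gaze_offset_degrees(20, 180), 1))  # 6.3  -- much closer to eye contact

# But scale the monitor up 3x to match the 3x distance and the offset
# scales with it, so the angle is right back where it started:
print(round(gaze_offset_degrees(60, 180), 1))  # 18.4 again
```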
My point is that a camera in the middle of the screen will be visible. Nobody wants a shiny little lens in the middle of their smartphone screen.
This just in: the new iOS 13 (still in beta) features FaceTime attention correction. Basically it alters your image so that your eyes seem to be looking straight at the camera. Details here.
I was wondering if iPhones already do something similar. I have noticed when taking selfies that if I look at my own face on the screen, I seem to be looking “at the camera” whereas if I look at the lens (at the end of the phone, when held in landscape orientation), it looks like I’m looking off to one side. This makes no sense, unless my eyes are really wonky.
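For what it’s worth, the first half of that fits the geometry sketched above: on a phone the lens sits only a couple of centimeters from where the eyes appear in the preview, and at arm’s length that’s a tiny angle. The numbers here are my own guesses, not measurements:

```python
import math

# Hypothetical phone-selfie geometry: front lens ~2 cm from the
# on-screen eyes, phone held ~40 cm away.
print(round(math.degrees(math.atan2(2, 40)), 1))  # 2.9 degrees
# A gaze error that small is plausibly below what viewers notice, so
# looking at your own face can still read as looking "at the camera".
```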
Here is an Ars Technica article on this subject.
Here is a startup project that popped up on my Facebook feed. This one is not embedded into the screen but is a poor man’s version that just hangs down in the middle of your screen to get the same result.
Man, I remember reading the O.P. way back in the day and thinking at the time that this was an understandable, but rather specialized, problem.
Back then, it seemed trivial - a cosmetic issue, really, and nothing more.
Fast forward a year & change, and suddenly, EVERYone wants a solution to the problem. It ain’t trivial no more.
The ZTE Axon 20 5G, released in December 2020, was the first smartphone with an under-display camera. The reviews of the camera seem distinctly meh.
An interesting and different approach to this problem is to use Machine Learning to reconstruct a frontal view of the speaker using input from a camera mounted to one side - there’s a demo of this here: Is Videoconferencing With Smart Glasses Possible? 👓 - YouTube
According to the PR, this is the first invisible below-display camera. (Earlier attempts had a much lower resolution in the cutout area, which was noticeable during use.)