Is technology available to embed a camera into a screen?

I have video chats with my daughter using my desktop computer and it’s always goofy because my camera is at the top of the monitor but I have to look below it to see my daughter on the screen. So my image always shows me looking down, instead of making eye contact.

Is technology available to integrate a camera right into the screen, without obscuring the pixels, so you can have eye contact on video?

I have no idea even conceptually how that would work but then again I had no idea how touchscreens could work either until someone invented that.

Apple filed a patent for this almost a decade ago, and both Apple and Samsung appear to be working on it for the precise reason you mention. To see through the screen, the device will either have to darken the section over the lens or somehow phase the camera and the displayed image out of sync.

Stranger
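
That “out of sync” idea is basically time-division multiplexing between the panel and the sensor. Here is a minimal sketch of the timing pattern, where `display` and `camera` are hypothetical driver handles rather than any real API:

```python
import time

PANEL_HZ = 120            # assumed high-refresh panel
FRAME_S = 1.0 / PANEL_HZ

def run_time_multiplexed(display, camera):
    """Alternate normal display frames with camera exposures.

    `display` and `camera` are hypothetical handles; the point is only
    the pattern: the pixels over the lens go dark exactly while the
    sensor is integrating light, ideally too briefly to notice.
    """
    while True:
        display.show_frame(lens_region_dark=False)  # normal frame, shutter closed
        time.sleep(FRAME_S)
        display.show_frame(lens_region_dark=True)   # blank the pixels over the lens
        camera.expose(duration_s=FRAME_S)           # capture during the dark frame
```

The catch is that the camera only sees half the frames, so it gets half the light and low-light performance suffers.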

Not sure if this is what you’re asking for, but mass-production technology exists now. Believe it or not, it is as low-tech as drilling a hole in the glass panel. Doing so, however, is an expensive process with a lot of scrap, so it costs much more than putting the camera in the bezel area.

I use a tablet. It has two cameras, which is kind of silly. The one on the back is OK, but of limited use. I mean, a person looks damn awkward and stupid framing up a picture with this big slab. Then, of course, there’s the issue mentioned with the front camera. Why not lay one long camera in a slot near the edge, on a gooseneck? That way I could take pictures more conveniently (aim the camera itself instead of the whole tablet), turn it around for video chat, and the longer focal length might improve image quality. Sometimes it seems like these fancy engineers have no clue about ergonomics.

You could, in principle, just put the camera behind the screen with little redesign needed for some display types.

Most displays do not have 100% pixel coverage. You can see that from some of the imagery here. Although not all of the black space can be used (it’s used by active components like transistors), at least some fraction, perhaps up to ~50%, is available. That’s still a decent amount of light available for the camera. Even portions of the dead space containing wiring could be used with a transparent conductor (like indium tin oxide).
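
To put a number on that, here’s a back-of-the-envelope sketch of the light cost, taking the ~50% figure above at face value:

```python
import math

fill_factor_available = 0.50  # rough guess from the paragraph above

# Each halving of light is one photographic stop, so 50% transmission
# costs the camera about one stop of exposure.
stops_lost = -math.log2(fill_factor_available)
print(f"{stops_lost:.1f} stops lost")  # -> 1.0 stops lost

# Equivalent aperture penalty: light scales as 1/N^2, so an f/2.0
# selfie camera behind the panel gathers light like an f/2.8 one.
print(f"effective f-number: {2.0 / math.sqrt(fill_factor_available):.1f}")
```

A one-stop penalty is noticeable in low light but hardly fatal for a selfie camera.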

One would have to be careful not to let light from the display propagate back to the camera, and there might be some diffraction effects from the aperture being broken up into many small openings. But these should be solvable problems, and for a camera that’s just used for selfies, the quality decrease might be acceptable.

There’s another possibility here, too: have cameras at both the top and bottom, and synthesize an image from a virtual center camera using depth-from-stereo. A real depth sensor like on the iPhone X would help as well.

Modern phones probably have enough graphics horsepower to get away with this already. Maybe not perfectly, but in a few years they’ll be fast enough that you really won’t know the difference.
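
For the curious, here is roughly what that synthesis could look like using OpenCV’s stereo matcher. The file names are placeholders, and this is only a sketch of the idea, not a production pipeline (for cameras at the top and bottom of the screen you’d rotate the images 90 degrees first, since the matcher expects horizontal disparity):

```python
import cv2
import numpy as np

# Placeholder file names; any rectified left/right pair works.
left = cv2.imread("left_cam.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right_cam.png", cv2.IMREAD_GRAYSCALE)

# Estimate per-pixel disparity: how far each feature shifts between views.
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=9)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # SGBM is fixed-point

# Naive view interpolation: move each pixel halfway toward its position
# in the other view, faking a camera midway between the two real ones.
h, w = left.shape
ys, xs = np.mgrid[0:h, 0:w]
shifted = (xs - disparity / 2.0).round().astype(int)
ok = (disparity > 0) & (shifted >= 0) & (shifted < w)
virtual = np.zeros_like(left)
virtual[ys[ok], shifted[ok]] = left[ok]

# Forward warping leaves holes at occlusions; a real pipeline would
# inpaint those and blend contributions from both source cameras.
cv2.imwrite("virtual_center.png", virtual)
```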

It must be possible to put a camera to the left of your screen and another to the right, and have you appear on the other person’s screen in 3D.

I’ll do some research.

I was involved in the manufacturing, selling and marketing of video related products for many years and this is one area which has been researched for decades. I haven’t been involved in the last five or six years, but it’s always been a few years away.

Maybe less than ideal, but you could try looking at the camera when you’re speaking and at the screen when she’s speaking. You could even put your favorite wallet-sized picture of her right above the camera.

If that doesn’t sound appealing, another solution would be to back up a bit further from the screen/camera until the difference between you looking at the camera and you looking at the screen isn’t noticeable.
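
The backing-up trick is just geometry: the perceived gaze error is the angle subtended by the offset between the camera and where her face sits on screen. A quick sketch, assuming a 10 cm offset:

```python
import math

CAMERA_OFFSET_M = 0.10  # assumed gap between webcam and her face on screen

def gaze_offset_deg(offset_m, viewing_distance_m):
    # Angle between "looking at the camera" and "looking at her face".
    return math.degrees(math.atan2(offset_m, viewing_distance_m))

for d in (0.5, 1.0, 2.0):
    print(f"{d:.1f} m away: {gaze_offset_deg(CAMERA_OFFSET_M, d):4.1f} degrees off-axis")
```

A commonly cited rule of thumb is that a gaze error under about five degrees stops reading as “looking away,” which by these numbers means sitting a bit over a meter back.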

There’s a phone now that has the fingerprint sensor (which is an imaging device) behind the OLED screen, so I imagine it could be done. OLED screens are naturally somewhat transparent without an explicitly added opaque backing layer.

Even though the content displayed on the screen might interfere with the image capture, it should be possible to adjust for this (the device in theory knows exactly what each pixel is doing, so it ought to be possible to subtract the displayed image from the captured one).
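
That subtraction might look something like the sketch below. The leakage fraction and blur width are invented calibration constants, just to show the shape of the correction:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def remove_display_bleed(captured, displayed, k=0.08, sigma=3.0):
    """Subtract an estimate of display light leaking into an under-screen camera.

    `k` (fraction of display light reaching the sensor) and `sigma`
    (how defocused that light is at the sensor plane) are made-up
    numbers standing in for per-device calibration.
    """
    bleed = k * gaussian_filter(displayed.astype(np.float32), sigma)
    return np.clip(captured.astype(np.float32) - bleed, 0, 255).astype(np.uint8)
```

In practice the correction would have to be per-channel and calibrated against the panel, but the principle is exactly the “subtract what you know you displayed” idea above.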

I think Apple and Samsung are aware that a significant percentage of their clientèle are quite worried about being filmed while unaware, to the point of putting a sticker on top of their front-facing camera. There would be no convenient way to put a sticker on a hidden camera in the middle of the screen.

I get the paranoia. I know people (my mom, for one) who do that. But a tablet or phone still has a microphone, so how do you prevent malware from gaining access to that? It is pretty darn difficult to fully block a mic.

I read years ago that someone (maybe Cisco?) was working on technology that would artificially shift your irises so you appear to be looking at the camera, Snapchat-filter style. Maybe that’s already out there and I just haven’t found it. I could imagine it being a little creepy if not done right, but it would solve the “looking down” problem more easily and less expensively than unobtrusively building a camera into the screen.

My arm gets tired being awkwardly extended during video chat. I don’t really take selfies, but maybe if I had a selfie stick it would make my video chats more comfortable.

Or a pocket tripod. :)

Regarding the actual question about a camera in the screen: I honestly thought before reading the OP that the question was going to be about privacy and surveillance. Maybe screen-embedded cameras have mostly stayed off the market not because they’re so hard to do (though maybe they are, I wouldn’t know), but because the idea is unpopular in some circles. Maybe the concept disturbs too many people.

It’s not entirely paranoia. There was a case a few years back where a school district provided their high schoolers with laptops, and then spied on them using the built-in camera without telling them. Which they insist is a power that they used only for good.

But iris direction is not the only cue. This would result in a view of me with my head bowed down and my eyes pointing up. See the first three photos here.

Note that direct, straight on photos at a close distance do not tend to be very flattering.

While perspective at a typical portrait distance (at least 15’) tends to make faces look flatter (which is why 85mm is the historical minimum for portraits in 35mm), a straight-on shot with a wide angle up close will dramatically increase the apparent size of your nose.

While I am sure Apple and other companies are working on this technology, it is most likely an attempt to grow the screen to the edge of the device. They will still probably offset the camera to reduce the impact of this effect, which highlights double chins and makes for large noses with prominent nose hair. (Look at your own nose in the mirror; for most people, a straight-on shot looks right up the nose.)

Here is a link showing the effect I noted above; the looping video in the top post demonstrates it better than thousands of words can.
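
To put rough numbers on it: apparent size scales inversely with distance from the lens, so the nose (closest to the camera) is magnified relative to the ears. A quick sketch, assuming ~12 cm of nose-to-ear depth:

```python
NOSE_TO_EAR_M = 0.12  # assumed depth of a face, front to side

for camera_dist_m in (0.3, 0.6, 1.5, 4.5):
    ratio = (camera_dist_m + NOSE_TO_EAR_M) / camera_dist_m
    print(f"camera at {camera_dist_m} m: nose appears {ratio:.2f}x "
          f"its proper size relative to the ears")
```

At arm’s length the nose is exaggerated by 20–40%; by 15 feet (about 4.5 m) the distortion drops to a few percent, which is why portrait photographers back off and use longer lenses.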

The problem with having a camera in the center of the screen is that that’s where most people put the stuff they’re working on, so preserving image quality there would be an issue. And I don’t think the technology exists, nor would it be possible, to create such a tiny lens, whether glass, diamond, pinhole, or otherwise.

Originally I thought of maybe a photon-sieve-type approach, but again, the problem is preserving the image quality in the center of the screen.