
Did You Know Your Browser Can Guess What You’re Attracted To?

Your Webcam Isn’t Watching You. It’s Watching What You’re Watching.

Let’s kill a myth right now:
Your webcam isn’t just watching your face.

It’s watching your attention.

Not what you click.
Not what you say.
Not even your facial expressions.

It’s watching where your eyes go—and in that motion, it’s decoding your mind.

Because in 2025, we’ve crossed a line: basic, everyday webcams—no special gear, no wires strapped to your face—can now map your gaze in real time, with a level of precision that used to require lab-grade hardware.

This isn’t face tracking. It’s cognitive telemetry.

It’s a silent contract you never signed. A surveillance layer embedded in the same lens you use for Zoom, FaceTime, TikTok. A camera that now doubles as a psychometric scanner. A digital MRI that reads your visual cortex through your pupils.

Here’s how it works:
Your webcam records your face, frame by frame. Algorithms trained on tens of thousands of eye images analyze the video to extract eye landmarks—the center of the pupil, the corners of your eyes, the curvature of your eyelid. Then, using appearance-based machine learning models, they reconstruct your gaze vector—where you’re looking on the screen, and for how long.

Every flick. Every pause. Every regression back to a word you didn’t understand. Every second you linger on something you want but won’t admit to. Logged. Timestamped. Structured into gaze data: coordinates, durations, dilation patterns, blink rate, scanpaths.

Your attention becomes machine-readable.
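That structuring step is mundane code. A dispersion-threshold (I-DT) fixation detector—the workhorse that turns raw gaze samples into the fixations and durations described above—fits in a few lines of JavaScript. The thresholds here are illustrative, not standards:

```javascript
// Minimal dispersion-threshold (I-DT) fixation detector.
// samples: [{x, y, t}] gaze points in screen px with timestamps in ms.
// maxDispersion (px) and minDuration (ms) are illustrative values.
function detectFixations(samples, maxDispersion = 35, minDuration = 100) {
  const fixations = [];
  let start = 0;
  while (start < samples.length) {
    let end = start;
    // Grow the window while all points stay inside the dispersion threshold.
    while (end + 1 < samples.length) {
      const win = samples.slice(start, end + 2);
      const xs = win.map(s => s.x), ys = win.map(s => s.y);
      const dispersion = (Math.max(...xs) - Math.min(...xs)) +
                         (Math.max(...ys) - Math.min(...ys));
      if (dispersion > maxDispersion) break;
      end++;
    }
    const duration = samples[end].t - samples[start].t;
    if (duration >= minDuration) {
      const win = samples.slice(start, end + 1);
      fixations.push({
        x: win.reduce((a, s) => a + s.x, 0) / win.length, // centroid x
        y: win.reduce((a, s) => a + s.y, 0) / win.length, // centroid y
        duration,
      });
      start = end + 1;
    } else {
      start++; // no fixation here; slide past the first sample
    }
  }
  return fixations;
}
```

Feed it a 30 fps gaze stream and out come the exact "coordinates, durations" records the trackers log.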

And suddenly, the invisible becomes visible.

Companies don’t need to wonder what parts of their site are working anymore—they can literally watch your thought process unfold through your pupils. Researchers don’t need to ask you how you felt watching a video—they can measure your subconscious response in milliseconds.

Want to know what this looks like in practice?

WebGazer.js: A browser-based eye tracker built at Brown that runs on raw webcam data using only JavaScript. No plugins, no server connection. Just pure, in-browser gaze tracking.

Here’s what that means:
Your gaze is now an interface.
And your eyes are a cursor you can’t turn off.

You can move your mouse away. You can stop typing. You can even lie with your face.
But your gaze? It betrays you—every time.

This week inside the Neurospicy Cyber Club, I dropped a 60-page ebook that goes deep into the technical guts of eye tracking—the same research I used to teach myself how all this works.
From machine learning models to biometric surveillance tactics, it’s a dense, citation-heavy breakdown for people who like to go way past the headlines. → Join now to get access

Could Your Eye Movements Be Used as a Password… or Evidence?

Here’s where it gets uncomfortable.

You think of your gaze as private. Passive. Harmless.
But eye movements are biometric signals. And they’re shockingly revealing.

Let’s go deeper:

Pupil dilation reveals emotional arousal and cognitive load. If you’re solving a tough problem, watching something exciting, or feeling stress, your pupils expand. You don’t choose that—it’s autonomic. A direct line from your brainstem to the light entering your eyes.
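In practice, that signal is quantified as percent change from a resting baseline. A minimal sketch (the numbers and any threshold you'd attach are illustrative):

```javascript
// Sketch: baseline-relative pupil dilation, the usual way "arousal" or
// "cognitive load" gets quantified from a pupil-diameter stream.
// baselineMm: resting pupil diameter; samplesMm: diameters during the stimulus.
function pupilDilationPercent(baselineMm, samplesMm) {
  const mean = samplesMm.reduce((a, d) => a + d, 0) / samplesMm.length;
  return ((mean - baselineMm) / baselineMm) * 100; // % change from baseline
}
```

A resting diameter of 4.0 mm that averages 4.4 mm while you watch something is a 10% dilation—the kind of delta these systems flag.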

Arousal? Same story. In a Cornell study, researchers used pupil dilation alone to accurately detect participants' sexual orientation, just by showing videos. No words or confessions. Just involuntary ocular reactions.

Fixations and regressions reveal your comprehension. If you re-read a word or sentence, your eyes trace a regression path. Tracking these patterns is one way reading difficulties like dyslexia are screened for—not just in clinics, but in the wild.

Blink rate correlates with anxiety, stress, and cognitive load. Nervous job candidates blink more. People in high-stress simulations blink more. Deception research links blinking to lying too, though the direction of the effect is contested. Some eye-tracking systems now use blink patterns to flag high-stress moments in training, UX testing, even policing scenarios.
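Counting those blinks from webcam video is trivial once you have an eye-aspect-ratio (EAR) time series—the ratio of eyelid opening to eye width, which collapses when the lid closes. A sketch, with an illustrative 0.2 threshold:

```javascript
// Sketch: counting blinks from an eye-aspect-ratio (EAR) time series.
// EAR drops sharply when the eyelid closes; each below-threshold run = one blink.
// The 0.2 threshold is a common illustrative value, not a standard.
function countBlinks(earSeries, threshold = 0.2) {
  let blinks = 0, closed = false;
  for (const ear of earSeries) {
    if (ear < threshold && !closed) { blinks++; closed = true; }
    else if (ear >= threshold) closed = false;
  }
  return blinks;
}

// Convert a sampled EAR series into the blink-rate metric trackers report.
function blinksPerMinute(earSeries, sampleRateHz) {
  const seconds = earSeries.length / sampleRateHz;
  return countBlinks(earSeries) / (seconds / 60);
}
```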

Scanpaths show your internal thought process. How you scan a face, how you process a product page, how your attention navigates a video frame—it’s all recorded. And it’s unique. Like a fingerprint. Meaning…

You can actually be identified by your gaze.

Let that sink in:
Your eye movement patterns—how fast you scan, how long you fixate, where you tend to look—can be used to verify your identity. Studies show these patterns are distinct enough to serve as a biometric signature. Not just a password—but a passive one. A behavioral ID you can’t change, revoke, or reset.

Now imagine this:

  • You walk into a VR simulation.

  • The system recognizes you—not by face or voice—but by how you look around.

  • You start typing a password with your eyes, using a gaze-based keyboard.

  • A remote attacker, using a technique called GAZEploit, captures your eye movements and reconstructs what you typed with over 80% accuracy.

Your gaze just leaked your password.
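The decoding step in an attack like that is almost embarrassingly simple: once the attacker has your fixation points, recovering keystrokes is nearest-key lookup. A sketch—the 3-key layout and coordinates are invented for illustration, not GAZEploit's actual method:

```javascript
// Sketch: decoding gaze-typed input by snapping each fixation to the
// nearest key on a virtual keyboard. Layout coordinates are illustrative.
const KEYS = [
  { ch: "a", x: 0,   y: 0 },
  { ch: "b", x: 100, y: 0 },
  { ch: "c", x: 200, y: 0 },
];

function decodeGaze(fixations, keys = KEYS) {
  return fixations.map(f => {
    let best = keys[0], bestDist = Infinity;
    for (const k of keys) {
      const d = Math.hypot(f.x - k.x, f.y - k.y); // Euclidean distance to key
      if (d < bestDist) { bestDist = d; best = k; }
    }
    return best.ch;
  }).join("");
}
```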

But it gets worse.

In law enforcement and national security, there’s a method called the Guilty Knowledge Test. It presents suspects with images—some relevant to a crime, some random. You don’t have to answer anything. You don’t even have to react.

Your eyes will.
You can’t help it.

Studies show that even trained individuals trying to suppress recognition still show longer fixations on familiar faces, objects, or words. The system reads that hesitation, that split-second attention spike, as cognitive recognition.
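The readout behind that test is a dwell-time comparison: mean fixation on the crime-relevant "probe" versus mean fixation on irrelevant items. A sketch—the 1.5x ratio is an illustrative threshold, not a forensic standard:

```javascript
// Sketch of a Guilty Knowledge Test readout: how much longer did the
// suspect dwell on the probe item than on irrelevant ones?
function recognitionScore(probeDwellsMs, irrelevantDwellsMs) {
  const mean = a => a.reduce((s, v) => s + v, 0) / a.length;
  return mean(probeDwellsMs) / mean(irrelevantDwellsMs);
}

// Flag "recognition" when probe dwell exceeds irrelevant dwell by the ratio.
const flagsRecognition = (probe, irrelevant, ratio = 1.5) =>
  recognitionScore(probe, irrelevant) >= ratio;
```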

Eye tracking is being used as mind-reading, by proxy.

And you don’t have to ever say a word.

So yes—your eye movements could be used as a password.
They could also be used as evidence.
And in the wrong hands, they could become a backdoor into your psyche.

This is more than surveillance.
This is pre-crime telemetry.
This is biometric profiling layered on top of every screen you look at.

So let me ask again:

  • What if a webpage knew what paragraph you couldn’t stop reading?

  • What if your phone could detect your attraction before you swipe?

  • What if your employer used your blink rate to decide you weren’t focused?

  • What if your gaze profile became your digital identity?

Your eyes aren’t just revealing what you’re looking at.
They’re revealing what’s looking back at you.

If This Tech Can Be Weaponized, Why Are Companies Building It Into Everything?

Short answer: control.
Longer answer: data, monetization, behavior prediction, and UI dominance.

Let’s name names.

Meta bought The Eye Tribe in 2016 and launched the Quest Pro in 2022 with full eye and face tracking. Their documentation now includes an “Eye Tracking Privacy Notice” stating gaze data may be used to “personalize experiences.” Translation: if your eyes linger on a virtual hoodie, expect to see it again. Everywhere.

Apple acquired SensoMotoric Instruments—one of the world’s top eye tracking companies—and baked that tech directly into the Vision Pro. But don’t be fooled by their polished privacy messaging. Eye tracking is a cornerstone of Apple’s AR/VR interface model: look, tap fingers, done. That’s not UI—it’s pre-emptive interaction. They don’t just want to know what you’re clicking. They want to know what you’re about to click, before you even act.

Google filed a “Pay-Per-Gaze” patent over a decade ago. The premise? Charge advertisers based on how long your eyes fixate on an ad—whether on screen or in real-world AR. Combine this with their ad empire and Android control, and it’s not hard to imagine a future where eye-based behavioral targeting becomes the baseline.

Samsung launched “Smart Stay” as early as 2012: a feature that keeps your phone screen on as long as you’re looking at it. Basic, yes—but it planted the seed. Phones are already watching for your gaze. They just haven’t told you how much they’re logging.

And then there’s the real curveball:

WebGazer.js—an academic tool built to democratize eye tracking by running entirely in-browser, using only your webcam. It calibrates itself using your clicks and cursor movements. Once it knows how you look at certain points on the screen, it can infer the rest—without needing to phone home.
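That click-based calibration is just regression: every click pairs an eye appearance with a known screen coordinate, and a model fits the mapping. A deliberately reduced sketch—one scalar eye feature to one screen axis, ordinary least squares, where the real library regresses on eye-patch pixels with ridge regularization:

```javascript
// Sketch of click-based gaze calibration in the spirit of WebGazer:
// fit a linear map from an eye feature (here a single horizontal pupil
// offset, for simplicity) to the screen x-coordinate of each click.
function fitCalibration(eyeFeatures, clickXs) {
  const n = eyeFeatures.length;
  const mx = eyeFeatures.reduce((a, v) => a + v, 0) / n;
  const my = clickXs.reduce((a, v) => a + v, 0) / n;
  let num = 0, den = 0;
  for (let i = 0; i < n; i++) {
    num += (eyeFeatures[i] - mx) * (clickXs[i] - my);
    den += (eyeFeatures[i] - mx) ** 2;
  }
  const slope = num / den;
  const intercept = my - slope * mx;
  return feature => slope * feature + intercept; // predicts screen x
}
```

Four clicks is enough to fit this toy model; after that, every frame of eye appearance becomes a predicted screen position—no more clicks needed.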

If a free JavaScript library can do that, imagine what proprietary ML models from trillion-dollar tech corps are capable of.

Here’s why this matters:

Whoever owns the gaze stream doesn’t just know what you’re seeing.
They know what you're paying attention to.
They know when you’re confused.
They know when you hesitate.
They know when you’re about to act—but haven’t yet.

This flips the entire interface model on its head. You’re not interacting with the screen anymore.
You’re being watched by it.
And every micro-movement is being logged, analyzed, optimized, or manipulated in real time.

Still wonder why cybersecurity pros cover their webcams?

What If Your Gaze Is Already Generating Profit?

You might think this data stays inside lab experiments, UX teams, or academic papers.

It doesn’t.

Welcome to the gaze economy—where attention has a price tag, and your pupils are part of the product.

Let’s break down how this works.

User testing firms use eye tracking to create heatmaps—aggregated views of where people look first, longest, and last. This reveals design flaws, content blind spots, and high-value screen real estate. Eye movement data is used to optimize button placement, reduce bounce rates, and predict drop-off zones in onboarding flows.
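The aggregation behind those heatmaps is simple spatial binning. A sketch—cell size and screen bounds are illustrative:

```javascript
// Sketch: bin gaze points into a grid — the aggregation step behind
// the attention heatmaps UX firms sell. Higher counts = hotter cells.
function gazeHeatmap(points, width, height, cell = 100) {
  const cols = Math.ceil(width / cell), rows = Math.ceil(height / cell);
  const grid = Array.from({ length: rows }, () => new Array(cols).fill(0));
  for (const p of points) {
    const c = Math.min(cols - 1, Math.floor(p.x / cell)); // clamp to grid
    const r = Math.min(rows - 1, Math.floor(p.y / cell));
    grid[r][c]++;
  }
  return grid;
}
```

Run thousands of sessions through this and the hottest cells tell a designer exactly where to put the buy button.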

But that’s small potatoes.

In advertising, it gets way darker. Eye tracking bypasses click rates and focuses on the real question: did you actually see the ad? Not scroll past it. Not skim it. See it. Advertisers want to buy time on your retinas, not just your feed.

Enter pay-per-gaze.

Google was granted a patent for it in 2013: advertisers only pay when your eyes land on an ad, with potential modifiers based on pupil dilation (read: emotional response). Combine that with wearable AR, and you’ve got the blueprint for a marketplace where milliseconds of attention are auctioned in real time.

Meta is already playing this game. Their Quest Pro headset tracks your gaze and uses that data to determine what captured your attention in a 3D environment. Did you stare at a virtual sneaker on a shelf for more than two seconds? That interest gets logged. Maybe the next shelf has three pairs instead of one. Maybe the price drops in real time. Maybe that same sneaker stalks you across the internet afterward.

This isn’t theory. Meta’s privacy policy explicitly allows for gaze data to be used for engagement analysis. That means your eye movements are being tested as a feedback loop for content, advertising, and behavioral nudging.

And this is just the beginning.

E-commerce sites are already experimenting with adaptive layouts—where the page subtly reshuffles depending on where your eyes linger. See an item twice but don’t click? It might grow larger. Look away from a popup too fast? It might get stickier next time.

This is behavioral A/B testing on steroids. It’s real-time UX personalization, driven by subconscious micro-movements you didn’t even know you made.

Your attention is the product now.
Your gaze is the invoice.
And someone, somewhere, is already collecting.

Who’s Behind the Curtain—And What’s In It for Them?

This isn’t just the work of one creepy startup. It’s an entire ecosystem—stacked, well-funded, and racing to own the infrastructure of human attention.

Start with the eye tracking giants:

  • Tobii: Dominant in gaming, research, and now enterprise VR. Their SDK lets developers integrate gaze tracking into games, simulations, and apps. Their mission? “A world where all technology understands your attention.” Yiiiiikes on bikes.

  • SR Research: Behind the EyeLink 1000. Gold standard in academic research. Now licensing its systems for UX labs and military studies.

  • Pupil Labs: Open-source eye tracking for AR, HCI research, and remote labs. Making it cheaper, faster, more plug-and-play for developers to build systems that capture gaze data in the wild.

Then we’ve got Big Tech:

  • Meta: Bought The Eye Tribe. Built eye tracking into every VR headset they sell. Pushing for gaze-based control in Horizon Worlds. Explicitly exploring ads that respond to eye engagement.

  • Apple: Acquired SMI in 2017. Rolled that tech into Vision Pro, using gaze as the primary input mechanism. Also filed patents for gaze-based interaction with smart TVs, tablets, and even cars.

  • Google: Owns the patent for pay-per-gaze. Bought Eyefluence. Experimented with eye tracking in Glass and beyond. Combine this with Android’s camera permissions and Chrome’s browser dominance, and they’ve got the distribution channel for gaze tracking at planetary scale.

  • Microsoft: Building eye tracking into HoloLens. Already testing training programs that flag employees who “miss” visual safety checkpoints. It’s not just AR—it’s compliance surveillance, gaze-based productivity metrics, and enterprise control.

  • Amazon: Quiet, but not absent. Cameras in Go stores track hand and eye movement. Smart shelf systems monitor where your gaze lingers and what you don’t pick up. Combine that with Alexa’s screen-based devices and you’ve got a silent eye-to-cart pipeline.

Everyone is converging on the same goal:
Make gaze the next input.
And make sure they own the stream.

Because whoever controls the gaze layer doesn’t just know what people see.
They know what people choose.
And ultimately, they get to shape what choices even exist.

This isn’t about UX anymore.
It’s about who gets to filter, frame, and feed the future of attention.

Your Phone Is Watching Too—And So Is TikTok

Don’t let the laptop fool you. Gaze tracking isn’t limited to desktop webcams or VR headsets. Your phone’s front-facing camera—the one you use for selfies, Face ID, and checking if there’s something in your teeth—is also watching.

And social platforms are already experimenting.

TikTok has quietly tested eye-tracking analytics to measure what parts of a video you actually watch. Not just play time—where your eyes land. Whether you fixate on a creator’s face, on the caption, or the “buy now” link. That data is used to refine its hyper-addictive algorithm with surgical precision. You're not just the viewer—you’re the training set.

Instagram, too, has explored integrating gaze-based metrics for Reels and Stories. Eye tracking lets them know which parts of a video held your attention—even if you didn’t tap, like, or comment. That’s signal gold. It turns every scroll into a heatmap. Every pause into proof of engagement. Every blink into behavioral feedback.

This isn’t speculation—it’s trajectory.

Your phone already knows where your fingers go. Now it’s learning where your eyes do. And that’s a whole new class of consent you’re not being asked for.

Because if your face is visible, your gaze is trackable. And if your gaze is trackable, your attention is for sale.

Can You Fool the System Watching Your Eyes?

If eye tracking is a one-way mirror, the next logical question is this: can you break the glass?

The answer is yes—but it takes strategy. This isn’t ad-blocking. This is optical counter-surveillance. You’re not disabling the machine. You’re misleading it.

Let’s start with the basics: cover your camera. Revoke webcam/camera permissions. If a site or app doesn’t need your face, it doesn’t need your gaze either. That alone blocks most passive tracking.

But if you’re still showing up on video—or inside a headset—then it’s time to move beyond blocking. You don’t just want to hide. You want to mess with the signal.

Let’s talk spoofing.

Cursor spoofing is one of the oldest tricks. It works by sending false signals to gaze tracking systems that piggyback off your mouse to calibrate attention. You move your cursor over elements on the page that you didn’t look at. Or you stare at something without hovering. Done long enough, and the data becomes noisy, less reliable, harder to monetize.
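One way to automate the trick: schedule synthetic hover dwell on elements you are not looking at, so a cursor-calibrated tracker learns a poisoned mapping. A sketch—the element names, dwell ranges, and scheduling logic are invented for illustration:

```javascript
// Sketch: cursor spoofing via decoy hover targets. Real gaze stays on one
// element while synthetic cursor dwell is spread across the others, so any
// tracker calibrating gaze against the cursor learns noise.
function spoofCursorSchedule(elements, trulyViewed, decoysPerView = 3, rng = Math.random) {
  const decoys = elements.filter(e => e !== trulyViewed);
  const schedule = [];
  for (let i = 0; i < decoysPerView; i++) {
    // Pick a random element you are NOT looking at, and "hover" it
    // for a plausible-looking dwell of 200-600 ms.
    const target = decoys[Math.floor(rng() * decoys.length)];
    schedule.push({ hover: target, dwellMs: 200 + Math.floor(rng() * 400) });
  }
  return schedule;
}
```

Injectable rng makes the schedule testable; in the browser you would replay it as synthetic `mousemove` events.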

But that’s just noise. What about total obfuscation?

Enter gaze jammers—software overlays that break the link between your real gaze and what’s being tracked. Some use screen filters that slightly shift visual content so your fixations fall just outside tracked zones. Others inject microanimations or pixel jittering to trick the system into thinking you’re not staring at anything long enough to register interest.

In VR and AR environments, the arms race is even more intense. Some researchers have developed eye blockers—virtual objects that intercept gaze before it hits a sensitive zone, like a password field. Others use decoy reticles, subtle eye-catching shapes designed to redirect your visual attention momentarily and confuse predictive models.

Want to go hardware-level?

There are IR-blocking glasses that scatter or absorb infrared light, rendering corneal reflections unreadable to PCCR-based trackers. Some prototypes even use active emitters to flood the camera with false glints, creating an overload of tracking data.

On the software side, privacy-aware browsers are starting to emerge. Extensions that detect embedded eye tracking scripts and block them like trackers. It’s early-stage, but the demand is growing.

And here’s the simplest, most effective tool of all:
Awareness.

Just knowing when and where gaze is being tracked puts you back in the loop. The human brain adapts. You can change how you read. You can slow your scanning. You can break patterns. Even a small tweak in behavior—like consciously blinking when you read an ad—can distort data enough to devalue it.

Because resistance doesn’t always mean stopping the system. Sometimes, it means starving it.

You don’t need to vanish.
You just need to become unpredictable.

That’s how you turn surveillance into static.

What Happens When Your Eyes Replace Your Hands?

Let’s close the loop.

This isn’t just about privacy. Or profit. Or creepy VR headsets logging your every glance.

This is about the interface itself. The entire paradigm of interaction is shifting—away from clicks and taps and toward something far more intimate:

Your gaze as input.

In the old world, your cursor was the proxy for intention. In the new one, it’s your eyes. And they move 3–4 times every second. That’s a whole different speed, a whole different level of control, and a whole different class of vulnerability.

Because when you look becomes when you act.
And when you act becomes what you become.

The psychological firewall between thought and action starts to dissolve. UX collapses into reflex. Feedback loops get tighter. You don’t even notice you’ve made a choice—until the interface responds.

And when your gaze controls what appears, what gets prioritized, what gets remembered…
Then whoever owns the gaze stream owns the future interface layer.

This is why Apple, Meta, Google, and Microsoft are racing to get there first. Not just to make things easier—but to reshape what “choice” even means in a screen-based world.

So what do we do now?

We fight back. With tools. With knowledge. With friction.

We break the spell.

Ask yourself this:

If your gaze is now the default input…
What happens when your attention becomes a form of consent?

Because if you’re not protecting your gaze,
you’re already giving it away.

Stay Curious,

Addie LaMarr

P.S. 📚 The newest drop inside the Cyber Club unpacks everything this newsletter couldn’t fit: the academic studies, corporate patents, surveillance use cases, and all the hidden mechanics behind attention tracking.
No bullshit. Just the real technical research trail I followed to understand how this tech works—and where it’s going next.
Grab the full ebook in the Neurospicy Cyber Club