WebMCP is available for early preview

(developer.chrome.com)

81 points | by andsoitis 2 hours ago

9 comments

  • BeefySwain 1 hour ago
    Can someone explain what the hell is going on here?

    Do websites want to prevent automated tooling, as indicated by everyone putting everything behind Cloudflare and CAPTCHAs since forever, or do websites want you to be able to automate things? Because I don't see how you can have both.

    If I'm using Selenium it's a problem, but if I'm using Claude it's fine??

    • avaer 19 minutes ago
      In a nutshell: Google wants your websites to be more easily used by the agents they are putting in the browser and other products.

      They own the user layer and models, and get to decide if your product will be used.

      Think search monopoly, except your site doesn't even exist as far as users are concerned, it's only used via an agent, and only if Google allows.

      The work of implementing this is on you. Google is building the hooks into the browser for you to do it; that's WebMCP.

      It's all opaque; any oopsies/dark patterns will be blamed on the AI. The profits (and future ad revenue charged for sites to show up on the LLM's radar) will be claimed by Google.

      The other AI companies are on board with this plan. Any questions?

    • akersten 39 minutes ago
      I'm old enough to remember discussions around the meaning of `User-Agent` and why it was important that we include it in HTTP headers. Back before it was locked to `Chromium (Gecko; Mozilla 4.0/NetScape; 147.01 ...)`. We talked about a magical future where your PDA, car, or autonomous toaster could be browsing the web on your behalf, and consuming (or not consuming) the delivered HTML as necessary. Back when we named it "user agent" on purpose. AI tooling can finally realize this for the Web, but it's a shame that so many companies who built their empires on the shoulders of those visionaries think the only valid way to browse is with a human-eyeball-to-server chain of trust.
      • nkassis 9 minutes ago
        Just like then, we were naive in assuming folks wouldn't abuse these things to the point of everyone needing to block them into oblivion. I think we're relearning those lessons 30 years later.
      • cameldrv 32 minutes ago
        Me too, but it died when ads became the currency of the web. If the reason the site exists is to serve ads, they're not going to let you use a user agent that doesn't display the ads.
        • akersten 10 minutes ago
          > If the reason the site exists is to serve ads, they're not going to let you use a user agent that doesn't display the ads.

          They've been giving it the old college try for the better part of two decades and the only website I've had to train myself not to visit is Twitch, whose ads have invaded my sightline one time too many, and I conceded that particular adblocking battle. I don't get the sense that it's high on the priority list for most sites out there (knock on wood).

    • victorbjorklund 1 hour ago
      They wanna let you use the service the way they want.

      An e-commerce site? Wanna automate buying their stuff? Probably something they wanna allow, in controlled forms.

      Wanna scrape the site to compare prices? Maybe less so.

      • candiddevmike 32 minutes ago
        A brave new world for fraud and returns.

        Also I just recently noticed Chrome now has a Klarna/BNPL thing as a built in payments option that I never asked for...

    • loveparade 1 hour ago
      Not fine if you use Claude. But it's fine if you are Google Flights and the user uses Gemini. The paid version of course.
    • chrash 1 hour ago
      i'm seeing this at my corporate software job now. that service you used to need security and product approval for just to read the Swagger doc now has an MCP server you can install with 2 clicks.
      • politelemon 57 minutes ago
        Sometimes, it gets added there without your consent.
    • dawnerd 36 minutes ago
      And what site is going to open their API up to everyone? Documented endpoints already exist; why make it more complicated?
    • nojs 43 minutes ago
      It’s weirder than that. There is a surge of companies working on how to provide automated access to things like payments, email, signup flows, etc to *Claw.
    • OsrsNeedsf2P 50 minutes ago
      These are obviously different people you're talking about here
    • BeefySwain 1 hour ago
      Also, as someone who has tried to build tools that automate finding flights: the existing players in the space have made it nearly impossible to do. But now Google is just going to open the door for it?
    • jmalicki 1 hour ago
      In early experiments with the Claude Chrome extension, Google sites detected Claude and blocked it too. Shrug
    • parhamn 1 hour ago
      Is the website Stripe or NYTimes?
    • moron4hire 30 minutes ago
      Oh, that's an easy one. LLMs have made people lose their god damned minds. It makes sense when you think about it as breaking a few eggs to get to the promised land omelette of laying off the development staff.
    • nudpiedo 1 hour ago
      They'll want you to use an official API, follow the funnel they set up for you, and make purchases no matter what.
    • buzzerbetrayed 1 hour ago
      Why should a browser care about how websites want you to use them?
    • manveerc 1 hour ago
      In my opinion, sites that want agent access should expose server-side MCP: the server owns the tools, no browser middleman. This already works today.

      Sites that don’t want it will keep blocking. WebMCP doesn’t change that.

      Your point about Selenium is absolutely right. WebMCP is an unnecessary standard: the same developer effort as server-side MCP, but routed through the browser, creating a copy that drifts from the actual UI. For the long tail that won't build any agent interface, the browser should just get smarter at reading what's already there.

      Wrote about it here: https://open.substack.com/pub/manveerc/p/webmcp-false-econom...
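      To make the server-side alternative concrete, here's a minimal sketch of the idea, assuming a plain JSON tool registry; the names (`searchProducts`, `handleToolCall`) are illustrative, not from any real MCP SDK:

```javascript
// Sketch of server-side tooling: the site owns a small tool registry and
// dispatches calls itself, with no browser in the loop. Tool names, the
// registry shape, and handleToolCall are all hypothetical.
const tools = {
  searchProducts: {
    description: "Search the product catalog by keyword",
    inputSchema: {
      type: "object",
      properties: { query: { type: "string" } },
      required: ["query"],
    },
    async handler({ query }) {
      // In production this would hit the same service layer the UI uses,
      // so the tool can't drift from what the site actually does.
      const catalog = [{ name: "toaster" }, { name: "kettle" }];
      return catalog.filter((p) => p.name.includes(query));
    },
  },
};

// Dispatcher you'd mount behind a single endpoint on your own server.
async function handleToolCall(name, args) {
  const tool = tools[name];
  if (!tool) throw new Error(`unknown tool: ${name}`);
  return tool.handler(args);
}
```

      Because the dispatcher lives next to the service layer rather than in the page, the agent-facing surface stays in sync with the backend instead of with the rendered UI.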

      • arjunchint 1 hour ago
        So... an API?

        Most sites don't want to expose APIs or care enough about setup and maintenance of said API.

        • manveerc 1 hour ago
          Are you asking if Agents should use API?
  • paraknight 1 hour ago
    I suspect people will get pretty riled up in the comments. This is fine, folks. More people will make their stuff machine-accessible, and that's a good thing, even if MCP doesn't last or if it's like VHS -- yes, Betamax was better, but VHS pushed home video forward.
  • yk 57 minutes ago
    Hey, it's the semantic web, but with ~~XML~~, ~~AJAX~~, ~~Blockchain~~, AI!

    Well, it has precisely the problem of the semantic web: it asks the website to declare, in a machine-readable format, what the website does. Now, LLMs are kinda the tool for interfacing with everybody's slightly different standard, and this doesn't need everybody to hop on the bandwagon, so perhaps this is the time where it's different.

    • ekjhgkejhgk 13 minutes ago
      There's nothing wrong with XML.
    • koolala 43 minutes ago
      Are AI smart enough to automatically generate semantics now? Vibe semantics? Or would they be Slop semantics?
  • arjunchint 1 hour ago
    The majority of sites don't even expose accessibility functionality, and for WebMCP you have to expose and maintain internal APIs per page. This opens the site up to abuse/scraping/etc.

    That's why I don't see this standard taking off.

    Google put it out there to gauge uptake. It's really fun to talk about, but my hot take is that it'll be forgotten by the end of the year.

    Rather, what I think the future will be: each website will have its own web agent to conversationally get tasks done on the site without you having to figure out how the site works. This is the thesis for Rover (rover.rtrvr.ai), our embeddable web agent: any site can add an agent that can type/click/fill just by adding a script tag.

    • ok_dad 41 minutes ago
      This isn't even MCP, it's just tools. If it were real MCP, I'd definitely have fun using the "sampling" feature of MCP with people who visit my site…

      IYKYK

    • jauntywundrkind 40 minutes ago
      > for WebMCP you have to expose and maintain internal APIs per page

      Perhaps. I think an API for the session is probably the root concern; page-specific tools are a nice-to-have.

      You say it like it's a bad thing. But ideally this also brings clarity & purpose to your own API design too! Ideally there is conjunct purpose! And perhaps shared mechanism!

      > This opens the site up to abuse/scraping/etc.

      In general it bothers me that this is regarded as a problem at all. Sites that hijack right-click to stop people from downloading images, or whatever, have been with us for decades. Trying to keep users from getting the data they want is, generally, not something I favor.

      I'd like to see some positive reward cycles begin, where sites let users do more, enable them to get what they want more quickly, in ways that work better for them.

      The web is so unique in that users often can reject being corralled and cajoled. That they have some choice. A lot of businesses bring the old app-centric "we determine the user experience" ego to the web, but, imo, there's such a symbiosis to be won by both parties by actually enhancing user agency, rather than waging this war against your most engaged users.

      This also could be a great way to avoid scraping and abuse, by offering a better system of access so people don't feel like they need to scrape your site to get what they want.

      > Rather what I think will be the future is that each website will have its own web agent to conversationally get tasks done on the site without you having to figure out how the site works

      For someone who was just talking about abuse, this seems like a surprising idea. Your site running its own agent is going to take a lot of resources!! Ensuring those resources go to what is mutually beneficial to you both seems... difficult.

      It also, imo, misses the idea of what MCP is. MCP is a tool calling system, and usually, it's not just one tool involved! If an agent is using webmcp to send contacts from one MCP system into a party planning webmcp, that whole flow is interesting and compelling because the agent can orchestrate across multiple systems.

      Trying to build your own agent is, broadly, imo, a terrible idea that will never let the user wield the connected agency they'd want to bring. What's so exciting and interesting about the agent age is that the walls and borders of software are crumbling down, and software is intertwingularizing, is soft & malleable again. You need to meet users & agents where they are, if you want to participate in this new age of software.

      • arjunchint 6 minutes ago
        > You say it like it's a bad thing. But ideally this also brings clarity & purpose to your own API design too! Ideally there is conjunct purpose! And perhaps shared mechanism!

        I update my website multiple times a day. I want as much decoupling as possible. Every time I update an internal API, I don't want to also have to think about updating this WebMCP config.

        Basically I have to put in work setting up WebMCP, so that Google can have a better agent that disintermediates my site.
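        One way to limit that drift, as a sketch under the assumption that the tool declarations live in the same codebase as the API: declare each input schema once and reuse it for both the API handler and the tool, so an API change forces the tool to move with it. All names here (`addToCartSchema`, `apiAddToCart`) are hypothetical.

```javascript
// Shared schema: declared once, used by both the internal API handler and
// the agent-facing tool declaration, so the two can't silently diverge.
const addToCartSchema = {
  type: "object",
  properties: {
    sku: { type: "string" },
    quantity: { type: "number" },
  },
  required: ["sku", "quantity"],
};

// Minimal required-keys check; a real app would use a JSON Schema validator.
function validate(schema, args) {
  return schema.required.every((key) => key in args);
}

// The existing API handler validates against the shared schema...
function apiAddToCart(args) {
  if (!validate(addToCartSchema, args)) throw new Error("bad request");
  return { ok: true, sku: args.sku, quantity: args.quantity };
}

// ...and the tool declaration the agent sees reuses the same schema and
// handler, so updating the API is updating the tool.
const addToCartTool = {
  name: "add-to-cart",
  inputSchema: addToCartSchema,
  execute: apiAddToCart,
};
```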

        > Trying to keep users from seeing what data they want is, generally, not something I favor.

        This is literally the whole cat-and-mouse game of scraping and web automation; sites clearly want to protect their moats and differentiators. LinkedIn/X/Google literally sue people for scraping. I don't think they're going to package all this data as a WebMCP endpoint for easy scraping.

        Regardless of your preferences/ideals, the ecosystem is not going to change overnight due to hype about agents.

        > Your site running its own agent is going to take a lot of resources

        A lot of sites already expose chatbots; it's trivial to rate limit and show a CAPTCHA on abuse detection.

      • candiddevmike 1 minute ago
        But we have OpenAPI at home
    • lloydatkinson 1 hour ago
      Sadly I do see this slop taking off purely because something something AI, investors, shareholders, hype. I mean even the Chrome devtools now push AI in my face at least once a week, so the slop has saturated all the layers.

      They don't give a fuck about accessibility unless it results in fines. Otherwise it's totally invisible to them. AI on the other hand is everywhere at the moment.

  • 827a 59 minutes ago
    Advancing capability in the models themselves should be expected to eat alive every helpful harness you create to improve their capabilities.
  • jauntywundrkind 50 minutes ago
    I actually think WebMCP is incredibly smart & good (giving users agency over what's happening on the page is a giant leap forward for users vs exposing APIs).

    But this post frustrates the hell out of me. There's no code! An incredibly brief, barely technical run-down of declarative vs imperative is the bulk of the "technical" content. No follow-up links, even!

    I find this developer.chrome.com post to be broadly insulting. It has no on-ramps for developers.
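    For anyone wanting a starting point, here's roughly the kind of sketch the post could have included. The tool shape follows the general MCP convention; the registration call (`navigator.modelContext.registerTool`) is an assumption about the preview API surface, so treat that line as pseudocode.

```javascript
// Hypothetical WebMCP-style tool a page might expose. The tool object shape
// (name/description/inputSchema/execute) follows the MCP convention; the
// registration API name below is assumed, not confirmed.
const checkAvailabilityTool = {
  name: "check-availability",
  description: "Check whether a reservation slot is open",
  inputSchema: {
    type: "object",
    properties: { date: { type: "string" } },
    required: ["date"],
  },
  async execute({ date }) {
    // A real page would reuse its own client-side state or fetch layer here.
    const bookedDates = ["2025-01-01"];
    const available = !bookedDates.includes(date);
    return { content: [{ type: "text", text: available ? "open" : "booked" }] };
  },
};

// Only attempt registration where the (assumed) browser API exists.
if (typeof navigator !== "undefined" && navigator.modelContext?.registerTool) {
  navigator.modelContext.registerTool(checkAvailabilityTool);
}
```

    The notable part is that `execute` runs in the page, against the page's own state, which is the thing that distinguishes this from a server-side API.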

  • aplomb1026 6 minutes ago
    [dead]
  • whywhywhywhy 2 hours ago
    >Users could more easily get the exact flights they want

    Can we stop pretending this is an issue anyone has ever had.

    • thayne 1 hour ago
      Well I have had the problem of "I want to find the cheapest flight that leaves during this range of dates, and returns during this range of dates, but isn't early in the morning or late at night, and includes additional fees for the luggage I need in the price comparison" and current search tools can't do that very well. I'm not very optimistic WebMCP would solve that though.
      • trollbridge 52 minutes ago
        matrix.ita does this very well, and has been doing so for nearly 3 decades.
        • ekjhgkejhgk 9 minutes ago
          Do you mean this website? https://matrix.itasoftware.com

          I didn't know about it; I just checked it out for a flight I'll buy soon, and it shows almost none of the direct flights that I know exist because they're on Skyscanner...

    • qwertox 1 hour ago
      I want my local dm shop to offer me their product info as copyable markdown: ingredient lists and other health-related information. This could be a way to automate it.
      • arcanemachiner 1 hour ago
        Since you didn't say what a "dm shop" is, I'll assume you mean "dungeon master shop" where you buy Dungeons and Dragons-y stuff.

        Or maybe it's a "direct marketing shop", where you bring flyers to be delivered into people's mail? Yeah, that must be it.

        • Sophira 1 hour ago
          Given that it's about food or medicine somehow, because of the mention of ingredients lists and health-related information, it's probably https://en.wikipedia.org/wiki/Dm-drogerie_markt (usually abbreviated "dm").

          (I didn't know about that either before now.)

      • echoangle 1 hour ago
        Why would you want that over a proper API with structured data?
      • larrymcp 1 hour ago
        He probably means the large German drug store chain called DM.

        https://www.dm.de/

      • Lord_Zero 1 hour ago
        dm?