Everything posted by Mathwiz

  1. Unfortunately the Thorium author has not released a new version of Thorium since v.122. But to make Thorium "look" newer via its user agent, you could try adding --user-agent="Mozilla/5.0 (X11; Linux i686) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu Chromium/132.0.5047.196 Chrome/132.0.5047.196 Safari/537.36" to the end of the command in your Thorium shortcut. Any Web server checking the user agent will then think you're using Chrome 132 on Ubuntu Linux, which may get you past "please update your browser" pages or nags at some sites.
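     For example, the shortcut's Target line would end up looking something like the below. (The install path is just a guess at a typical location; adjust it to wherever your thorium.exe actually lives.)
        "C:\Users\<you>\AppData\Local\Thorium\Application\thorium.exe" --user-agent="Mozilla/5.0 (X11; Linux i686) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu Chromium/132.0.5047.196 Chrome/132.0.5047.196 Safari/537.36"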
  2. I no longer bother to determine Chase's minimum browser version; once I find an SSUAO that works, there's very little point. I used to, though, and it never seemed to make any sense: it would just be some seemingly random version that was "somewhat" older than the then-current ESR version of Firefox or Chrome. I always wondered whether they chose minimum versions that had patched some specific security flaw they were worried about, but who knows? Still, the currently supported version should always work, at least as long as you remove the "R3dfox" bit, or any other clue that you aren't using genuine Firefox / Chrome.
  3. This page explains what some of the "referer" prefs do: https://notepad.patheticcockroach.com/4256/tweaking-referer-settings-in-firefox-and-tor-browser/
  4. And another surprise. I set referer.trimmingPolicy to 2 on my work PC, then went to do the same thing on my home PC - and my home PC was already set to 2! Where did that come from? Turns out it's in the "UOC Patch" - a set of preferences intended to improve performance, developed by @looking4awayout long ago. Since this particular setting has little to do with performance, I assume he set it for privacy reasons, and it found its way into his UOC Patch by accident. Apparently he was ahead of his time, since it's now the default setting in newer FF versions.
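     For anyone who wants to set it by hand: the full pref name is network.http.referer.trimmingPolicy, and it can be flipped in about:config or added to user.js like so (0 = send the full URL, 1 = strip the query string, 2 = send only scheme, host and port):
        user_pref("network.http.referer.trimmingPolicy", 2);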
  5. Fascinating. The setting was always there (since FF 28 or so); Mozilla just changed the default.
  6. The "Phishing URL Blocklist" broke AVSForum.com today. I don't know why but suddenly it's blocking all CSS URLs with an ampersand (&), which made a complete mess of AVSForum.com. Weird. Turned it off in the UBO Legacy dashboard and AVSForum is good again.
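     If anyone would rather keep the rest of that blocklist, a narrower workaround might be an exception filter in the "My filters" pane - untested on my end, and you may need the hostname the CSS is actually served from rather than avsforum.com itself:
        @@||avsforum.com^$stylesheet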
  7. The change to browser behavior makes some sense: The Referer header was always an information leak, so the change improves privacy when following links (the target doesn't know where you came from). But if you're right, CloudFlare is abusing that change to block older browsers. Hopefully either MCP or @roytam1 can develop a fix soon.
  8. We seem to have drifted off topic a bit here. IIRC, the thread was originally about Micro$oft Copilot invading our privacy on PCs running Windows 11. I haven't even tried to use Copilot for anything, but for those of us forced to use Windows 11 at work, is there any way to avoid or block this BS?
  9. I still use WMC, even in 2025, with EPG123: Of course I've been using it for a really long time, like ten years. I don't know how hard it would be to set it up again from scratch. I agree that it's a shame Microsoft abandoned this software. It was included in Windows 8, but you had to buy a key for $10 from Microsoft to unlock it, and there were no improvements between WMC 7 and WMC 8. Windows 10 abandoned it completely, although there are unofficial hacks to get it working on Windows 10. (Don't know for sure but they probably work on Windows 11 too.) As for tuners, I would probably go with a used SiliconDust HDHomeRun. That plugs into your home network, so you aren't locked into using it with just your PC, if you decide WMC isn't the solution for you. You would need the correct HDHR version for your country, since TV standards differ across the globe.
  10. Interesting find. Can you narrow down the version that broke rt.com? There are only three versions in between. Also send a screen shot of the crash notification, so folks have some idea where the breakage is. And last but not least, if possible try rt.com on the latest official Basilisk (requires Win 7 so you may have to borrow a PC). Of course official releases are somewhat behind these test releases, so it may work there; but if it doesn't you can report it to @basilisk-dev and help more users.
  11. We need a "benign exploit" page (a page that triggers the bug but doesn't do anything harmful) to test for this vulnerability. We had one for the WebP vulnerability.
  12. You are right. You need version 138 or above to get the patch. If folks don't want to update, the patch is unavailable to them. For those folks, the only safe option is to turn off the V8 optimizer as described previously. I suppose, in theory, someone skilled in building Chromium could apply the patch to earlier versions, but I can't imagine anyone would do so, unless there were a very popular old version that many folks were reluctant to update from.
  13. Version 138 is required for the fix; the bug goes back earlier though: Good catch. Google is being tight-lipped on exactly when this vulnerability crept in. I doubt it goes all the way back to 2008, though. Today's V8 looks nothing like the original. I believe (and should have said) versions prior to the V8 optimizer are not vulnerable. I suspect 360EE (and Kafan MiniBrowser) aren't vulnerable because the option to turn off the optimizer isn't there (presumably because there's nothing to turn off), but I can't be sure with the limited info we have.
  14. It's well hidden: Settings / Privacy and Security / Manage V8 Security (near the bottom of the page - scroll down) / Don't allow sites to use the V8 optimizer (this will slow down Javascript). Really old Chromium versions (e.g., 360EE) predate the V8 optimizer in question and so are (presumably) not vulnerable.
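     For builds that have a modern V8 but predate this settings page, a blunter alternative - which I haven't tested on every version - is launching with V8's JIT disabled outright via a command-line switch. It shuts off more than just the optimizer and slows Javascript further, but disabling the JIT entirely also disables the optimizer in question:
        chrome.exe --js-flags=--jitless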
  15. (Actually Moonchild said:) Good; so the "collective punishment" of being banned for living in the wrong country will end soon, hopefully. MC is wrong about one thing though: As noted here, Anubis unfortunately does require one more thing beyond being "a little patient the first time they visit:" turning off certain privacy guards. MC himself won't abuse this requirement: ... but other Anubis-protected sites may not be so civic-minded, and how's the end user supposed to know? One user presented a possible workaround though: I don't know if MC has Anubis configured this way, but those outside the geoblocks may experiment at their leisure.
  16. Yes; the page could've been clearer on exactly how "modern" your browser's Javascript needed to be. At any rate, UXP does seem up to the task, albeit inefficiently. There are many reasons that might have caused me to get the "denied" page, but it wasn't worth the effort to track it down. I was just wondering what kind of nonsense we WWW users have to deal with now, and why. My curiosity is "mostly" satisfied now.
  17. If you take the Anubis explanation (posted above by @VistaLover) at its word, it seems to make sense: make the user agent (browser or bot) do something rather hard, but not too hard. If you're just an ordinary user, the extra work is a short delay in reaching the Web page; but if you're a bot crawling millions of pages, that work isn't worth the effort, so you'll just abort the script after a few milliseconds and move on. But then - why insist on "modern" Javascript, and why force users to disable their privacy guards? I'm still somewhat skeptical that Anubis was telling us the whole story above.
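     For the curious, the "something rather hard" is essentially a small proof-of-work: the server hands the browser a random string and a difficulty, and the page's Javascript brute-forces a nonce whose hash meets it. Anubis's actual implementation differs in its details (the names and difficulty below are made up), but a bare-bones sketch of the idea, runnable in any browser with the Web Crypto API:
        // Brute-force a nonce such that SHA-256(challenge + nonce) begins with
        // `difficulty` zero hex digits. Purely illustrative - not Anubis's real code.
        async function solveChallenge(challenge, difficulty) {
          const encoder = new TextEncoder();
          for (let nonce = 0; ; nonce++) {
            const digest = await crypto.subtle.digest("SHA-256", encoder.encode(challenge + nonce));
            const hex = Array.from(new Uint8Array(digest))
              .map(b => b.toString(16).padStart(2, "0"))
              .join("");
            if (hex.startsWith("0".repeat(difficulty))) return nonce; // proof found
          }
        }
        // A human visitor pays this cost once; a crawler would pay it on every page it hits.
        solveChallenge("example-challenge", 4).then(n => console.log("nonce:", n));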
  18. So it is a bandwidth issue. Fair enough. I had no idea that AI crawling had become such a burden for Web servers. Still having a hard time grokking why the AI crawlers don't respect robots.txt though. AIUI, their purpose is just to gather content to train AI engines; surely there's plenty of content even without violating such a longstanding norm! In any case, I question Anubis's assertion that "The idea is that at individual scales the additional load is ignorable." It took R3dfox v.139 several seconds to complete the challenge, to say nothing of UXP browsers. But I suppose there was a silver lining: MC probably had to ensure UXP could pass the challenge before using it to protect his own repo! It would be quite embarrassing if RPO couldn't be accessed by Pale Moon....
  19. I sort of figured, but why don't AI crawlers respect robots.txt, when other Web crawlers do? That's what I was really after. Which leads to another question: why do public repos need to block AI crawlers so badly that Gitea resorted to Anubis to do the job? Is it a bandwidth issue or a legal one?
  20. Anubis (from Egyptian mythology) was also the name of a villainous character on the Stargate SG-1 television series. AI crawling sounds bad but I'm not sure why, what it is, how it differs from ordinary Web crawling, or why robots.txt cannot be relied on.
  21. Unrelated to original problem, but WTF is this? FWIW, r3dfox passes whatever this is and lets you in (eventually). The WWW has become such an unpleasant place.
  22. TL;DR - The "Do Not Track" header was abandoned because virtually no Web sites paid any attention to it. Google may have made the decision to pull the plug but it was already comatose. It remains to be seen whether its successor, Global Privacy Control, will be similarly ignored. GPC may be enforceable in the EU, so sites may choose to honor it there.
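     For what it's worth, GPC boils down to one more request header, plus a navigator.globalPrivacyControl property that page scripts can check; a browser with the setting enabled simply sends:
        Sec-GPC: 1
     So, technically, honoring it is no harder than honoring DNT was; whether sites bother is the real question.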
  23. Good to hear. I use 55 myself, but I also keep a copy of 52 handy. The different names arose back in the early days. Originally Serpent was just a generic name used by "unbranded" builds of Basilisk; i.e., if you or I built the browser on our own PC, it would be called "Serpent" too. Basilisk was the "branded" name used for the official version distributed by MoonChild Productions. For several years the folks at MoonChild Productions expressed great irritation that some users sought support for @roytam1's Serpent builds at Basilisk's Web site (then run by MCP), so we were all taught to be careful never to refer to Roytam's builds as "Basilisk" and risk driving more Serpent users there.
  24. Discover.com works with r3dfox as-is, so no SSUAO needed there. (Edge or Thorium users aren't so lucky; Discover seems to demand a quite recent Chromium engine. Supermium would probably do the trick but I haven't tried it.) I see several sites with this SSUAO: Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:128.0) Gecko/20100101 Firefox/128.0 This works with chase.com as well. So I guess it was the r3dfox bit that it didn't like after all. Win 7 doesn't seem to bother it.
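     (For anyone unfamiliar, an SSUAO is just a per-site string pref in about:config named after the domain - that's certainly how it works in UXP browsers, and I believe r3dfox honors per-site overrides too, but double-check your build. For Chase it would be a new string pref named
        general.useragent.override.chase.com
     with the Firefox 128 string above as its value.)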
  25. On Win 7, R3dfox is now my preferred replacement for M$ Edge. I had been using the latest Edge version for Win 7 (109), with a UAO to Chrome 125, but that's no longer good enough for some sites (e.g., discover.com). I did find that Chase.com doesn't like the R3dfox slice in the user agent - or was it the OS slice, revealing Win 7, that it was objecting to? It kept telling me to "upgrade" my browser even though R3dfox is up to version 139! Well, either way, a straight FF 128 on Win 10 user agent satisfies both Chase and Discover, at least for now. It's ridiculous how bloody finicky some Web sites - particularly financial ones - have become. Security I dig, but way too many folks equate "security" with "only using Chrome, Edge, or Firefox, and a version no older than a few months."