

  • Donations

    20.00 USD 

RamonUn last won the day on April 5

RamonUn had the most liked content!

About RamonUn

Profile Information

  • OS
    2003 x86


RamonUn's Achievements



  1. Wow, I had not seen this topic before. I must give this a try; it seems to be amazing...
  2. @feodor2 I confirm I am able to use 2FA with GitHub without giving them my phone number! Click Settings -> Password and Authentication, then click the *enable 2FA* button. You will see a three-step process where you first must get a key from GitHub: it will show a QR code, but you can actually click to see the code as text. Save the code (16 characters long) to a text file, then use the python script or another program (MOS Authenticator also works fine) to generate a temporary code from the master key GitHub gave you. Your clock must be accurately synced with real time (a few seconds off max), because the generated code changes every 30 seconds. I tried this on two different accounts and it worked fine. Of course maybe it is different in different countries. I would suggest trying it on a test account that you can delete later anyway, just in case something goes wrong; this is what I did. I am not sure if this will remove the random extra e-mail confirmation, but maybe not. I will have to see over the next few days or weeks. I hope it will work for you, as it would simplify your life.
  3. Another TOTP generator in 20 lines of python; the code is quite easy to review for yourself: https://github.com/susam/mintotp I will have to try it on some random GitHub account to see what works for me before applying it to my main account.
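The core of such a generator (RFC 6238 TOTP on top of RFC 4226 HOTP) really does fit in a few lines of standard-library Python. This is a minimal sketch, not mintotp itself; the base32 key below is a made-up example standing in for the 16-character key GitHub shows during 2FA setup.

```python
import base64
import hmac
import struct
import time

def totp(key_b32: str, step: int = 30, digits: int = 6) -> str:
    """Compute the current time-based one-time password for a base32 key."""
    # Decode the base32 secret, adding '=' padding if the key length needs it.
    key = base64.b32decode(key_b32.upper() + "=" * (-len(key_b32) % 8))
    # The HOTP counter is the number of 30-second steps since the Unix epoch.
    counter = struct.pack(">Q", int(time.time()) // step)
    mac = hmac.new(key, counter, "sha1").digest()
    # Dynamic truncation (RFC 4226): pick 4 bytes at an offset from the MAC.
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # six-digit code, changes every 30 seconds
```

This is also why the clock sync matters: the counter is derived from the current time, so a clock more than a few seconds off lands in the wrong 30-second window.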
  4. I think this definition is a bit harsh, because software is usually built by the developer, who can choose to use newer/older Windows or even cross-compile from Linux or OS/2 or whatever; this does not mean the target OS is not really used. Newer MSVCs depend on newer Windows versions, but not for real technical reasons, rather because of development cycles: MS has to drop old Windows versions eventually, but keeping XP compatibility would not be a huge burden compared to creating new frameworks every two years. You can still use the latest GCC on Windows XP to this day and build C++23 programs. The really hardcore definition would be to only use programs that you built yourself, like on Gentoo Linux, and the most hardcore definition would be to only use software that you wrote yourself (including the compiler). I do agree with you that there is a purity loss, but even VS2010 was probably built on Windows 7, so would even that count? We always depend indirectly on the past as well: no compilers would run on XP if it were not for compilers on older Windows, so any XP user depends on older Windows. Also, with this definition nobody is really using Windows 7 anymore, because most programs are built on Windows 10/11, and soon no one will really be using Win10 either. Don't get me wrong, I do not really disagree with your definition, I just find it a bit hardcore. Also, I have always felt Windows XP to be quite meaty, so I do not feel vegetarian at all when using it; I feel carnivore...
  5. By "real" I understood using XP as a main OS, and obviously all the software I write is written on XP and runs on XP. I stick to XP because the hardware I use works just fine with it and newer OSes would be slower for most things. I also dislike newer Windows for several reasons that are outside the scope of this thread, but I did use Win10 at work in the past and I really did not like it. After some tweaking it was better, but with the updates all my settings were reverted, so I would end up disabling most updates, which kinda defeats the point of having an up-to-date OS, so it was not the best idea for me. I do not know what I will do when GitHub forces 2FA; probably I will migrate somewhere else.
  6. You can count me in; I do not even have a Win10 install. My main PC is on Windows Server 2003. I also have a PC on Windows 7, but I hardly ever use it, and an old PC with Win98SE that I do not use much anymore; up until 2012 it was my main PC. At some point I do plan to buy a new PC with Win10 so I can test some software I am writing.
  7. It seems Google is reconsidering adding JPEG-XL support to Chromium: https://bugs.chromium.org/p/chromium/issues/detail?id=1451807 On this point at least we have taken a step forward.
  8. Nice to see more JS features implemented in the UXP engine. I thought that https://scoop.sh/ would work now, but I still get a blank page and the `ReferenceError: BigUint64Array is not defined` error with the latest Serpent build. Is that expected behavior? Maybe I should flip a flag to get the BigInt stuff, or is this specific feature not yet implemented? EDIT: Well, I answered my own question; the Big(U)Int64Array support was merged just a few days ago and is not yet in Roy's builds...
  9. You can easily modify the latest CMake 3.26.4 using kernelxp.dll and ws2_xp.dll from xompie. I did this on my computer and it seems to work fine.
  10. I agree 32-bit OSes should not be there anymore, but I still see them because of stupid OEM vendors that sell bad stuff cheaper, and many people with a little budget end up with this. Microsoft has been willing to drop 32-bit since Vista, but the vendors decided otherwise. I am unsure it would save much on the CPU, because 1) the actual computing unit is a minority of a modern CPU, and 2) 64-bit mode still allows you to use most 16/32-bit instructions, so they will be parsed almost the same; the only thing you are saving is some very specific circuits related to CPU modes (I am not a CPU designer, however). There would probably be advantages to dropping support for 32-bit mode on future CPUs, but at this point I think the x86 architecture is close to its end, even if dropping 32-bit mode were to save significant cost. Intel did make the good decision to abandon the AVX-512 instruction set, which was HUGE and completely unused by 99.9% of programs, for the simple reason that not enough CPUs were equipped. Will the compiler catch these bugs? It will most of the time, but many people are not even using the compilers' warnings, and even the warnings will not catch many things. For example, I have seen people use the %lld format with printf to print an address: in 32-bit mode you do get a warning, but not in 64-bit mode. Or they use a Win32 LPARAM, cutting it into two 32-bit DWORDs, and this fails if you are compiling in 32-bit mode, because LPARAM is only 32 bits in 32-bit mode. In this latter case it would require a real change, not just changing a type. Many popular open-source projects, however, do their best. When I program I use all warnings and several compilers, and I still make this kind of mistake occasionally, which I catch when building in 64-bit mode, but I admit I am a terrible programmer.
  11. I do not like it much either; there are still plenty of 32-bit Windows installs. I have even seen several 32-bit Win10 laptops that were not upgradable to 64-bit because of the UEFI BIOS. So in the end I think there are more 32-bit Windows installs than Linux desktops, even though all AMD/Intel CPUs have been 64-bit since the early 2000s... Also, I always saw the duality of architectures as a free sanitizer; you would be surprised how many subtle bugs can be found just by building and testing in both 64-bit and 32-bit. There were a lot of buggy 32-bit programs that were hard to port to 64-bit, and now I see random programs that have a ton of warnings simply when you build them for 32-bit, and some of them cannot even work. Bad coding habits still exist but have just migrated from "I assume my CPU is 32-bit, so int == void* and stuff" to "I know my CPU is 64-bit, so uint64_t == void* and stuff". Windows 10 will be supported until 2025 (and possibly beyond), so dropping 32-bit support is very strange, because it is still maintained by Microsoft.
  12. Google's WebP and AVIF are not bad image formats by themselves, but they are inferior to JPEG-XL and lack lossless JPEG re-compression. This is not strange; I mean, the JPEG team has a lot more expertise in image compression algorithms than Google. It is a shame that bing.com is using the User-Agent for content-type negotiation; it is extremely bad practice. HTTP has had a content-negotiation mechanism specifically for this reason since at least HTTP/1.1. There has never been any excuse to use UA sniffing to decide which image format should be delivered to a client. Edit: it was actually introduced in HTTP/1.0 (1996) and is not present in HTTP/0.9 (1991), ref: https://www.w3.org/DesignIssues/Conneg
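Proper negotiation only needs the request's Accept header, never the User-Agent. A minimal Python sketch of the idea (the function name and the preference order are my own illustration, not what bing.com actually does):

```python
def pick_image_format(accept_header: str) -> str:
    """Return the best image MIME type the client advertises in Accept."""
    # Server-side preference order, best compression first (illustrative).
    preferred = ["image/jxl", "image/avif", "image/webp"]
    # Split the header on commas and drop quality parameters like ";q=0.8".
    accepted = {part.split(";")[0].strip() for part in accept_header.split(",")}
    for fmt in preferred:
        if fmt in accepted:
            return fmt
    return "image/jpeg"  # classic JPEG as the universal fallback

print(pick_image_format("image/avif,image/webp,image/*;q=0.8"))  # image/avif
```

A real implementation would also honor the q-values and wildcards defined for Accept, but even this crude version shows the point: the client declares what it supports, so the server never has to guess from the browser name.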
  13. I remember the fuss when Chrome decided to remove the JPEG-XL support that was behind an experimental flag. I know some big companies, including Facebook, really wanted the new format, because it would help significantly with bandwidth without loss of quality: JPEG-XL has a way to compress classic JPEGs without any loss, so it is a dream for anyone with a huge image base that is 99% JPEGs, which can be re-compressed losslessly by 20% (more or less). Also, for newer images an even higher visual quality per bit can be achieved, so again any photo store is interested in any ounce of better image compression. Safari adopting JPEG-XL might be enough to make some big websites start using it (it should pay for itself in bandwidth savings over time). Of course, content is negotiated, and classic JPEGs would be served to unaware clients.
  14. Very cool, this is a significant plus for me. Even if the web does not use the format, there are times where I encounter such a file, and it is always good to have more options to open jxl files. I guess we will have this next week?
