DULLARD
Also a bonus code I won't forget:
ADE-NAI-WRA-LKA
You can all guess what games they are from.
Erm... Ok
The consensus seems to be that AMD priced their cards higher expecting Nvidia to price higher than they did.
Then Nvidia priced lower than they expected (still too expensive imo) and AMD needed to react and price their card cheaper. The problem is retailers had already paid for shipments, so AMD needed to settle some sort of reimbursement process for the soon-to-be out-of-pocket retailers.
This was a big issue for them, but also they realised they could generate more frames if they wanted to, and match Nvidia so they would be able to also claim crazy high FPS figures (it's all nonsense, we care about raster performance).
To be able to do this they needed a couple of months to dev and test it before reviewers get it.
So delaying the launch lets them solve both problems with the extra time, but the trade-off is that they are missing a window to gain market advantage, even if it lets them match Nvidia's frame-generation narrative rather than the pure raster performance gamers actually care about.
Not answering your comment directly, and I don't even use Linux, BUT..
One reason a lot of us don't use Linux even if we really want to is that its biggest strength is also one of its biggest weaknesses: its modularity.
There isn't a single packaging system, window manager, file system, shell, etc etc.
This makes it hard for companies (and devs in general) to target Linux for releases. For example, if you want to release something for Windows, you build a single exe; Apple is a dmg (I think). Either way, you build for one single platform with a consistent API.
When you want to build for Linux there can't be just one build/package. This has actively been cited as a reason why some commercial software doesn't support Linux, as it takes far more effort to support all the major permutations of platform and package management.
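To make the fragmentation concrete, here's a rough sketch of how many separate artefacts one Linux release might need. The packaging tools named are real, but the app name, version, and file layout are all invented for illustration:

```python
# Hypothetical release matrix for one app. The tool names (dpkg-deb,
# rpmbuild, flatpak-builder, appimagetool) are real packaging tools, but
# "myapp" and its files are made up; this just prints what you'd have to
# build, it doesn't run anything.
APP, VERSION = "myapp", "1.0"

build_commands = {
    "deb (Debian/Ubuntu)":  f"dpkg-deb --build {APP}_{VERSION}_amd64",
    "rpm (Fedora/openSUSE)": f"rpmbuild -bb {APP}.spec",
    "flatpak":               f"flatpak-builder build-dir com.example.{APP}.yml",
    "AppImage":              f"appimagetool {APP}.AppDir",
    "tarball (generic)":     f"tar czf {APP}-{VERSION}.tar.gz {APP}/",
}

for target, cmd in build_commands.items():
    print(f"{target}: {cmd}")
```

On Windows that whole table collapses to one installer, which is the point.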
So back to your question: why is Valve's Steam OS going to help? Because it's going to be a single platform with a single way of doing things. You can always go and replace the bits like on any Linux distro, but out of the box it will be easy enough for vendors to support, and it will hopefully also get more adoption because it has commercial support.
Look at Android as an example (I know it's not entirely the same): it's just a customised version of Linux, but because it's consistent and has a single way to manage packages, it's widely adopted.
I am pretty sure Linus himself said one of the reasons Linux desktop doesn't have mass adoption is that no one can agree on how things should be done, so we have hundreds of libs all doing the same thing in different ways. Valve will pick what they think is best (even if it isn't technically the best) and through that we all have a single point of effort and adoption to centralise on.
Recently, I would say Roadwarden, was such a great game with such a unique feel to it.
What a gem, my friend and I played through the first 2 lunar games in university.
I also discovered Wild Arms, which was pretty good and doesn't get much attention compared to JRPGs like FF and Suikoden.
There have been some decent results historically with checkerboard and other separated reconstruction techniques. Nvidia was working on some new checkerboard approaches before they killed off SLI.
A decade or two ago most people I knew had dual GPUs; it was quite common for gamers, and while you were not getting 100% scaling it was enough to be noticeable, and the prices were not mega bucks back then.
On your point of buying 1 card vs many, I get that, but it seems like we are reaching some limitations with monolithic dies. Shrinking chips seems to be getting far harder and costly, so to keep performance moving forward we are now jacking up the power throughput etc.
Anyway, the point I'm trying to make is that it's going to become so costly to keep getting these more powerful monolithic GPUs, and their power requirements will keep going up, so if it's two mid-range GPUs for $500 each or one high-end GPU for $1.5k with possibly higher power usage, I'm not sure it will be as much of a shoo-in as you say.
Also if multi chiplet designs are already having to solve the problem of multiple gpu cores communicating and acting like one big one, maybe some of that R&D could benefit high level multi gpu setups.
It was some on-board GPU with my super amazing AMD K6-2; it couldn't even run Mega Man X without chugging. Then a friend gave me an S3 ViRGE with a glorious 4MB of VRAM.
I'm sure there is a simple answer and I'm an idiot, but given it's in a place that gets lots of sun, can they not just install solar panels with batteries at consumer/grid level?
Or is the problem not with the generation of the power but with transmitting it to properties? I don't know the cost of solar installation, but given the amount it's costing them when it all fails, I'm sure they could at least incentivise individuals to install solar or something.
Really enjoying it so far.
I was initially saddened to hear it was going to follow in the footsteps of 15 and be an action-based RPG, as I thought 15 was a brain-dead "warp strike simulator" with horrible story pacing and poor characters (until the last 5% of the game).
This game though has simple but effective action combat with enough variety to be fun and the characters and pacing are a joy.
I still wish we could get some FF games like 7 or 9 where there is depth to equipment, magic and turn-based combat, but JRPGs have been iterating away from complex battle systems and still sell well, so I can't see them going back.
I still think FF7 was the pinnacle, as mixing and matching materia with equipment was really simple and super fun.
Anyway, rant over. FF16 is good, recommend it.
One point that stands out to me is that when you ask it for code it will give you an isolated block of code to do what you want.
In most real world use cases though you are plugging code into larger code bases with design patterns and paradigms throughout that need to be followed.
An experienced dev can take an isolated code block that does X and refactor it into something that fits in with the current code base etc, we already do this daily with Stackoverflow.
An inexperienced dev will just take the code block and try to ram it into the existing code in the easiest way possible, without thinking about whether the code could use existing dependencies, whether it's testable, etc.
So anyway I don't see a problem with the tool, it's just like using Stackoverflow, but as we have seen businesses and inexperienced devs seem to think it's more than this and can do their job for them.
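To sketch the difference I mean: suppose the codebase already has a shared HTTP client and a logging convention. Everything here is a made-up illustration (the function names, the fake client, the endpoint), not any real API:

```python
# What an isolated answer from an LLM / Stack Overflow typically hands you:
# it builds its own plumbing from scratch, hard-codes the URL, and can only
# be tested by actually hitting the network.
def fetch_user_isolated(user_id):
    import urllib.request, json
    url = f"https://api.example.com/users/{user_id}"  # invented endpoint
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read())

# What an experienced dev refactors it into: the same logic, but wired into
# the surrounding application's dependencies and testable via injection.
def fetch_user(user_id, client, logger):
    """client and logger come from the existing code base, so this can be
    unit-tested with fakes instead of a live HTTP call."""
    logger.info("fetching user %s", user_id)
    return client.get(f"/users/{user_id}")

# Trivial fakes demonstrating why the second version is the one you want:
class FakeClient:
    def get(self, path):
        return {"path": path, "name": "test-user"}

class FakeLogger:
    def info(self, *args):
        pass

result = fetch_user(42, FakeClient(), FakeLogger())
print(result["path"])  # → /users/42
```

Same behaviour, but the second version respects the existing dependencies and is testable. That adaptation step is exactly what the inexperienced dev skips.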
The challenges that existed to use technology no longer exist, so there is no longer a reason to look under the hood for most people. It's like how a lot of generations after boomers don't know how to change a tyre or spark plugs: cars got more reliable and industries created services to stop you needing to worry about that stuff.
As a kid I remember WANTING to play games with a friend on PC; he knew we needed a null modem cable, so we went to a PC shop two towns over, got one, and tried to figure out how to play together using it. Then when the Internet came along we had to fight with Internet Connection Sharing so one computer could share its connection with a friend's PC. Trying to use no-CD patches just so we didn't need to keep grabbing CDs to play games, etc.
There were so many things you learnt back then, but only because we had no alternative. I get why tech knowledge has vanished and I don't blame them; they have had no need to solve the same problems and haven't grown up alongside the technology. It was already established, so they have had no need to concern themselves with it.
The problem is the working world still heavily needs PC skills and basic analytical ability, so there needs to be more focus on those old "computer driving licence" style courses, so people can certify they know how to find a file and End Task when something hangs.