Upgrading my main PC

falter
Veteran Member
Joined: Jan 22, 2011
Messages: 6,575
Location: Vancouver, BC
I've no problem spending on vintage hardware, but when it comes to current stuff I'm notoriously cheap and usually subsist on castoffs. However, while doing my Altair video I finally reached the limits of my patience with the GTX 1650 I'd had for two years - working on the timeline in Premiere with a 4K video, especially with animations and such, became an absolute battle, with the machine locking up every hour or so and the preview window crashing.

I haven't bought a brand new video card for myself since an NVIDIA 570, I think. I had a credit with my distributor, so that helped some. Man, what a stressful experience though. I didn't realize NVIDIA had gone over to the Dark Side in the minds of PC gamers. I found the multitude of options and price differences super confusing. I couldn't make myself spend almost two grand on a 4080, much less three or four on a 4090, so I went with a 4070 Ti. A lot of reviewers have dumped on the 4070 Ti as too expensive and short on VRAM, but I felt kind of hemmed in: there was a substantial difference between this card and a 4060, and the next jump up (to a 4080) was another $800. From what I gathered via user benchmarks, the 4080's real-world performance was only about a third better than the Ti's.

I did think about grabbing a second-hand 3090... but the main place I see those is Facebook Marketplace, and they get snapped up long before I see them. They're going for $1000 there, which is what the 4070 Ti goes for new. Benchmarks suggest the 4070 Ti outperforms the 3090 by about 25-30%, although it has half the VRAM.

Anyway, I had to replace the PSU with an 850W unit, and it came with the new PCIe 5.0 connector, so I didn't need to adapt anything. Overall, the new card seems pretty decent. I can work with complex bits on my Premiere timeline and it's SO much smoother and less crash-prone than before. I actually kept track of my hours on the Altair video; it ended up being around 80 hours of work (it's earned $450 in ad revenue so far, about $5.60 per hour - my old dishwashing wage, lol). I would bet 25% of that time was due to the horrifically slow speed of my rig. I did a test render of that video and it took a mere 45 minutes vs the previous 4.25 hours. That's awesome. Not sure why the HP Envy is beating it by ten minutes despite a 'lesser' card (that's another thing I learned with NVIDIA: newer isn't necessarily better). I'm thinking its i9 is beating my Ryzen 5 2600.

Don't know if I made the 'right' choice, but this is definitely better than what I had before, no doubt at all. Not really sure what I should tackle next. I was looking at more powerful Ryzen chips, but the performance difference compared to mine, if benchmarks are to be believed, isn't that massive - something like 30% faster for a Ryzen 9 5900X. That doesn't feel like a whole lot for the money spent.
 
Yeah.. some serious gamers are of the opinion that NVIDIA no longer cares about consumer graphics cards because of their success with AI, so they're just throwing out high prices and walking away.
 
This is why I don't produce videos. I hate spending that much money on equipment, much less when it's just a hobby. I do have to buy trucks and forklifts once in a while, and I usually buy used. But we buy stuff to make our jobs easier, right? So I'm glad it is working better for you.

Since I don't make videos, my rigs consist of old castoffs and I usually put Linux on them. I have listened to other folks who do their video production on Linux with free, open-source software, which lowers their costs somewhat. Isn't Premiere an Adobe product that runs on Windows? Have you tried lowering costs with Linux? Of course the learning curve can be a pain in the butt, and maybe Premiere is just better at what you do, I don't know. I was just thinking it might be similar to how we use Linux to delay hardware upgrades and licensing fees. But I'm not really sure if it works the same way in video production.

Anyway, I enjoy your videos even if they don't earn you as much as a dish washing job. :)

Seaken
 
Yeah.. some serious gamers are of the opinion that NVIDIA no longer cares about consumer graphics cards because of their success with AI, so they're just throwing out high prices and walking away.

Consumer graphics cards are a thing of the past to Nvidia; they are a rounding error on its financial sheets. In Q3 2023, Nvidia made $14.51B in data center sales for GPGPU and AI, compared to only $2.86B for "gaming". Much of that "gaming" category was GeForce NOW related, which is just GPUs in data centers streaming to gamers, rather than people with individual discrete video cards.

With all of that, it's no real surprise Nvidia is charging whatever it wants. It has no real competition, and everything it makes sells out constantly, so there's no incentive to compete on price. Gamers aren't the only ones being gouged, either; data centers have it even worse, with some of Nvidia's cards costing well into the tens of thousands of dollars.
 
It gives AMD a chance to catch up, or Intel a chance to make something decent. We do need another GPU maker out there, but the cost of entry is pretty high.
 
The days of new players getting into the GPU market are long gone. Sure, you could be a fabless company and come up with a design, but you'll never get it to market.

GPU technology is patented from A to Z; you'd go bankrupt just licensing the technology, even if you don't plan on using all of it. Not only would you have to deal with Nvidia's and AMD's lawyers, you'd have to deal with all of the patent-troll holding companies that snapped up the patents of old, defunct video chip companies decades ago.

You'd have to partner up with one of the old players that still has some relevance, like VIA or NEC, and rework some of their technology to work in the PC again. VIA is probably the easiest, since it developed its Chrome line of GPUs well into the DX9/10 era. NEC's PowerVR parts would be a second choice, but that would require a lot more work.
 
It gives AMD a chance to catch up, or Intel a chance to make something decent. We do need another GPU maker out there, but the cost of entry is pretty high.
I agree with Falter. I spend a lot on vintage gear but practically nothing on modern stuff, mainly because modern computers bore me. There's no fun like there used to be.

But as far as AMD and Intel go, I just bought my first new graphics card (a gift for my son).

It's an Intel Arc A380. It has good reviews, and it was a great price.
 