
What do/did you do for a living?

Vape Fan

_evil twin_
Staff member
Senior Moderator
VU Donator
Platinum Contributor
Press Corps
Member For 5 Years

The Cromwell

I am a BOT
VU Donator
Diamond Contributor
Member For 4 Years
You want to set up a 2nd HDD to look just like the first? And have data read/written to both drives as you use the PC, with that CPU?

This one?
https://acerrecertified.com/gateway...d-windows-8-sx2110g-uw24/#product-description
or
https://www.walmart.com/ip/Processo...way-AMD-8-E1-1500-Drive-SX2110G-UW23/24766511
The Wal Mart one.
I just want to make an image on the added drive, then remove the original drive, replace it with the new drive, and use it. Keep the original one as a backup, not installed in the PC.
If I recall correctly there is an unused SATA port in there.

Have only looked inside it once when cleaning it for dust and such.

I don't really need a 1 TB drive; I've only used 15 or 20 GB in the one that is in there. But they are so cheap.

I have a WD 1TB USB drive hooked to my router where I keep all my videos and such. Easy access that way to all my other computers. A crude NAS I suppose :)
 

Vape Fan

_evil twin_
Staff member
Senior Moderator
VU Donator
Platinum Contributor
Press Corps
Member For 5 Years
The Wal Mart one.
I just want to make an image on the added drive, then remove the original drive, replace it with the new drive, and use it. Keep the original one as a backup, not installed in the PC.
If I recall correctly there is an unused SATA port in there.

Have only looked inside it once when cleaning it for dust and such.

I don't really need a 1 TB drive; I've only used 15 or 20 GB in the one that is in there. But they are so cheap.

I have a WD 1TB USB drive hooked to my router where I keep all my videos and such. Easy access that way to all my other computers. A crude NAS I suppose :)
In that case, you're a prime candidate. It won't cost more, maybe less. If you're not going to use the space, then....
Besides, you're going to have the drive that's in it for more space.
Having said that, I don't have a recommendation on cloning software because I don't want to be wrong for your need, like if Windows doesn't want to boot on the new drive.
I found this interesting though:
youtube.com/watch?v=ioYPSsFnmNA

Crucial 120GB
Silicon Power 256GB
I have a USB to SATA adapter if you want to borrow it, but if the PC has at least 1 more port you can use that instead.
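For anyone curious what cloning software actually does under the hood, a raw clone is essentially a block-by-block copy from one device to the other. Here's a minimal Python sketch of that idea (the paths are placeholders; real cloning tools like Clonezilla or dd also deal with partition tables and boot sectors, which this does not):

```python
def clone_blocks(src_path, dst_path, block_size=4 * 1024 * 1024):
    """Raw block-by-block copy from src to dst. Returns bytes copied."""
    copied = 0
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        while True:
            chunk = src.read(block_size)
            if not chunk:  # end of source reached
                break
            dst.write(chunk)
            copied += len(chunk)
    return copied

# On Linux the source/destination would be block devices, e.g.
# clone_blocks("/dev/sda", "/dev/sdb")  -- run against the whole disk,
# not a mounted partition, and only with both drives unmounted.
```

This is why the "image then swap" plan works: the copy is exact down to the byte, so the new drive boots like the old one, assuming the clone covered the entire disk and not just one partition.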
 

Carambrda

Platinum Contributor
ECF Refugee
Member For 5 Years
Hmm, been looking at 1 TB drives on Amazon; the WD Blue and Seagate Barracuda are the same price, $45. 1 TB is as big as I want as they still take forever to fill or copy, especially in a USB situation.
Well, I haven't looked in recent years, but last time I looked, the USB ones from Western Digital were all still using USB 2.0, whereas Seagate had already been using USB 3.0 for ages. USB 3.0 hard drive performance for handling normal-size or large media files and/or backup archives is pretty much completely on par with SATA3. Only if you plan on doing a lot of 4K read/write operations (such as running an OS on it), or if you need a high-performance RAID array, is USB 3.0 not always great. Whenever I need more transfer speed I just plug one of my EMTEC 256GB "S600 SpeedIN" USB 3.0 pen drives into my laptop and that's it. They use fast MLC memory so they can keep up with the M.2 internal SSD that's inside my laptop. Actually they're not as super fast as the fastest (and much more expensive) ones from SanDisk, but still fast enough to keep up with what I have inside my laptop, so it doesn't matter to me anyway.

What does matter is that it takes no more than a couple of minutes to transfer a full Blu-ray remux (.mkv file) that's something like 20-30GB on average. Use only MPC-HC with madVR to play them, with audio bitstreaming over HDMI enabled (if you, like me, have a decent AVR or prepro that can do Dolby TrueHD and DTS-HD MA). Intel Haswell (4xxx) CPUs and up can handle a 23.976 Hz vertical refresh frequency on a TV really smoothly... just go to the Intel HD Graphics Control Panel and choose 23p Hz from there, and, after you're done watching the movie, change the setting back to 60p Hz or whatever it was before. I don't understand why so many people watch 23.976fps movies from the U.S. on a display that is still fixed at 60Hz. This ain't rocket science.
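A quick back-of-envelope check of that transfer-time claim, assuming ballpark sustained throughputs of ~35 MB/s for a USB 2.0 hard drive and ~150 MB/s for a USB 3.0 one (where the spinning disk itself, not the bus, is the bottleneck; the figures are illustrative, not measured):

```python
def transfer_minutes(size_gb, throughput_mb_s):
    """Rough transfer time in minutes for a file of size_gb gigabytes."""
    return size_gb * 1000 / throughput_mb_s / 60

# Assumed sustained throughputs (real-world ballpark, not bus maximums):
USB2_HDD = 35    # MB/s, USB 2.0 bus is the bottleneck
USB3_HDD = 150   # MB/s, the spinning disk itself is the bottleneck

for size_gb in (20, 30):
    print(f"{size_gb} GB: USB 2.0 ~{transfer_minutes(size_gb, USB2_HDD):.1f} min, "
          f"USB 3.0 ~{transfer_minutes(size_gb, USB3_HDD):.1f} min")
```

Under those assumptions a 25GB remux moves in under 3 minutes over USB 3.0, versus 10+ minutes over USB 2.0, which matches the "couple of minutes" experience described above.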
 

Carambrda

Platinum Contributor
ECF Refugee
Member For 5 Years
Not sure if my hardware/BIOS would support the latest HDD or SSD throughput.
Also not sure about the long-term life of SSDs. They use memory which has a limited number of write/erase cycles, and they do all kinds of manipulations to work around this problem.

And the final straw is they cost much more than a good HDD, and I have no need for a faster drive anyway.
Now if I did video rendering and such much....
I also play no video games which often require mo power!
The main reason why you want to run your OS and certain software applications on an SSD is that it makes the whole shebang so much more responsive in almost ANY situation, and the PC also boots up a lot faster. It doesn't have to be the fastest, most expensive SSD... even the cheapest (under $500) of mid-priced laptops worth their salt now either come with an M.2 internal SSD of at least 256GB, or come with Intel Optane caching technology.
 

casketweaver

Bronze Contributor
Member For 4 Years
ECF Refugee
The main reason why you want to run your OS and certain software applications on an SSD is that it makes the whole shebang so much more responsive in almost ANY situation, and the PC also boots up a lot faster. It doesn't have to be the fastest, most expensive SSD... even the cheapest (under $500) of mid-priced laptops worth their salt now either come with an M.2 internal SSD of at least 256GB, or come with Intel Optane caching technology.
You're right about that... A standard SATA SSD will suffice for many if not all situations. It's only when you want data security and redundancy that you start shelling out more for the more... 'intricate' SSDs. And since SSD life has damn near tripled (from where it was a few years ago) there's really no need to panic about life expectancy, so long as you're not data thrashing the drive itself. DATA THRASHING - adding and erasing information continuously for long periods of time. No matter how large or how small the information is, doing so can kill an SSD prematurely.
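To put that life-expectancy point in numbers: consumer SSDs carry a terabytes-written (TBW) rating, so you can estimate wear-out time from daily write volume. A small Python sketch (the 150 TBW rating and both write volumes are illustrative assumptions, roughly what a ~256GB consumer TLC drive is rated for):

```python
def years_to_wear_out(tbw_terabytes, gb_written_per_day):
    """Years until the rated terabytes-written figure would be exhausted."""
    return tbw_terabytes * 1000 / gb_written_per_day / 365

# Assumed rating: 150 TBW (ballpark for a ~256GB consumer TLC SSD).
print(years_to_wear_out(150, 40))    # normal desktop use: ~a decade
print(years_to_wear_out(150, 2000))  # sustained "data thrashing": months
```

Which is the point made above: at ordinary desktop write rates the rating outlasts the rest of the machine, and it's only continuous heavy rewriting that burns through it quickly.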

When you start talking about the fastest SSDs available, you're getting into M.2 territory. Those bastards aren't limited by SATA speed constraints; they're limited by bus speeds.

As for what I do for a living: I'm a wrench turning, coworker intimidating, boss shaming engineer. I work 3rds so... engineer by night, sleeping by day... and dad somewhere in the afternoon. Not sure though, since days start running together and weeks feel like months.

Sent from my Pixel 3 XL using Tapatalk
 

Vape Fan

_evil twin_
Staff member
Senior Moderator
VU Donator
Platinum Contributor
Press Corps
Member For 5 Years
Ten years ago the 1TB WD Caviar Black WD1002FAEX was a highly popular choice, but the 1TB Samsung Spinpoint F3 HD103SJ proved to be both cheaper and better.
Yep, that's why I said..
3.5" drives I've used that have worked very well for 10+ years:
WD Black 7200
Seagate Barracuda 7200
Samsung Spinpoint 7200
I prefer the Barracuda or Spinpoint over the Black as they seem to run smoother/quieter/faster.
I was actually gifted the new Black 750GB for speccing someone a build.
 

Carambrda

Platinum Contributor
ECF Refugee
Member For 5 Years
You're right about that... A standard SATA SSD will suffice for many if not all situations. It's only when you want data security and redundancy that you start shelling out more for the more... 'intricate' SSDs. And since SSD life has damn near tripled (from where it was a few years ago) there's really no need to panic about life expectancy, so long as you're not data thrashing the drive itself. DATA THRASHING - adding and erasing information continuously for long periods of time. No matter how large or how small the information is, doing so can kill an SSD prematurely.

When you start talking about the fastest SSDs available, you're getting into M.2 territory. Those bastards aren't limited by SATA speed constraints; they're limited by bus speeds.

As for what I do for a living: I'm a wrench turning, coworker intimidating, boss shaming engineer. I work 3rds so... engineer by night, sleeping by day... and dad somewhere in the afternoon. Not sure though, since days start running together and weeks feel like months.

Sent from my Pixel 3 XL using Tapatalk
The M.2 SSD in my laptop is just a SK Hynix HSF256G39TND-N210A. Nothing too special, but it still gets the job done nonetheless.
 

casketweaver

Bronze Contributor
Member For 4 Years
ECF Refugee
The M.2 SSD in my laptop is just a SK Hynix HSF256G39TND-N210A. Nothing too special, but it still gets the job done nonetheless.
But I do more than just keep my operating system on my solid states; I keep my video games and some of my media on solid state drives too. I was told storing video games on solid state drives was a bad idea because a lot of those bits and pieces of software have constant updates, but when I start looking at the memory units actually on the drive I don't see a whole lot of damage. There are some blocks that are unwritable, but s*** - for 5-year-old drives they're still running, so I can't really complain.

Sent from my Pixel 3 XL using Tapatalk
 

Vape Fan

_evil twin_
Staff member
Senior Moderator
VU Donator
Platinum Contributor
Press Corps
Member For 5 Years
Any OEM desktop PC is terrible compared to how the money could be spent DIY.
 

casketweaver

Bronze Contributor
Member For 4 Years
ECF Refugee
Oh yeah, you can spend $2000+ on a CPU and GPU alone. Let alone the rest of the hardware to power it all. A good main board, the most bleeding edge of RAM, PSU, storage devices, case, and cooler.

The upside, you'll become intimately familiar with your system, component manufacturer, and of course you'll likely own the system for longer periods of time than a prebuilt system.

Sent from my Pixel 3 XL using Tapatalk
 

Vape Fan

_evil twin_
Staff member
Senior Moderator
VU Donator
Platinum Contributor
Press Corps
Member For 5 Years
Even for a $300 machine it's better to DIY. You'll end up with something better, and you'll always have one that can be upgraded at any time. Not the proprietary crap.
 

The Cromwell

I am a BOT
VU Donator
Diamond Contributor
Member For 4 Years
Yep, DIY and make all the mistakes, like the wrong graphics card because the PC board slot is different, or one of many other things to battle through. Different power supply connectors; having to pick up adapters to get the drive you purchased to mount in the case you purchased.
Figure out where all the leads from the case go on the motherboard...
Lots simpler to DIY ejuice.
 

Vape Fan

_evil twin_
Staff member
Senior Moderator
VU Donator
Platinum Contributor
Press Corps
Member For 5 Years
Yep, DIY and make all the mistakes, like the wrong graphics card because the PC board slot is different, or one of many other things to battle through. Different power supply connectors; having to pick up adapters to get the drive you purchased to mount in the case you purchased.
Figure out where all the leads from the case go on the motherboard...
Lots simpler to DIY ejuice.
Speccing all the parts is the hardest bit. When done correctly everything goes together easily, since parts only fit in one place/way.
 

The Cromwell

I am a BOT
VU Donator
Diamond Contributor
Member For 4 Years
Speccing all the parts is the hardest bit. When done correctly everything goes together easily, since parts only fit in one place/way.
Unless like me you find out that Win 7 has no drivers for the video board you bought.

IMHO, anyone who does not have to return or order additional stuff when making their own PC has done it quite a bit. For a first-timer it is quite daunting.

I have built about 4 PCs from scratch over the years, and repaired and upgraded quite a few for myself and family and friends, etc.
And it is much more complicated now than it used to be. So many more variants.

For the vast majority of users: buy one already put together and use it.

You like to tinker and scratch your head wondering why your BIOS won't work with that other part, why all the screws for the motherboard won't fit right in the case you bought, or why there's not quite enough room for the water-cooled heat sink and your power supply, etc.

Yep DIY.
 

Carambrda

Platinum Contributor
ECF Refugee
Member For 5 Years
But I do more than just keep my operating system on my solid states; I keep my video games and some of my media on solid state drives too. I was told storing video games on solid state drives was a bad idea because a lot of those bits and pieces of software have constant updates, but when I start looking at the memory units actually on the drive I don't see a whole lot of damage. There are some blocks that are unwritable, but s*** - for 5-year-old drives they're still running, so I can't really complain.

Sent from my Pixel 3 XL using Tapatalk
Storing a huge collection of big media files entirely on SSD is way too expensive IMO. But SSD is completely silent because there are no moving parts, so copying only the files you need from a (bunch of) USB 3.0 hard drive(s) onto a cheap-ish SSD before playback can be a great option to consider: it eliminates the need for a separate room with a NAS and also avoids the comparatively high cost of a NAS. That is, at least if you want to rid the hi-rez audio experience of background noise while at the same time freeing up some extra budget for better audio/video quality. The only real downside is that before you start the playback you need to wait till the copying is done, and you also need to know how to force the hard drive(s) to sleep, and to stay asleep.
 

Vape Fan

_evil twin_
Staff member
Senior Moderator
VU Donator
Platinum Contributor
Press Corps
Member For 5 Years
Unless like me you find out that Win 7 has no drivers for the video board you bought.

IMHO, anyone who does not have to return or order additional stuff when making their own PC has done it quite a bit. For a first-timer it is quite daunting.

I have built about 4 PCs from scratch over the years, and repaired and upgraded quite a few for myself and family and friends, etc.
And it is much more complicated now than it used to be. So many more variants.

For the vast majority of users: buy one already put together and use it.

You like to tinker and scratch your head wondering why your BIOS won't work with that other part, why all the screws for the motherboard won't fit right in the case you bought, or why there's not quite enough room for the water-cooled heat sink and your power supply, etc.

Yep DIY.
You must mean that for that video card there are no W7-compatible drivers. Possible but doubtful, and I think you mean this as an example of what might happen during novice DIY. If specced correctly there are no issues with things working or not fitting.
Agree that the first time is daunting.
For anyone planning on using a PC for the rest of their life, I'd always recommend DIY, unless they just don't mind overpaying for inferiority or for some reason just aren't able to grasp the concepts. Anyone can get a PC specced for their use, learn the why behind those specs, and get guidance for assembly through boot-up, all for zero cost.
 

Carambrda

Platinum Contributor
ECF Refugee
Member For 5 Years
Oh yeah, you can spend $2000+ on a CPU and GPU alone. Let alone the rest of the hardware to power it all. A good main board, the most bleeding edge of RAM, PSU, storage devices, case, and cooler.

The upside, you'll become intimately familiar with your system, component manufacturer, and of course you'll likely own the system for longer periods of time than a prebuilt system.

Sent from my Pixel 3 XL using Tapatalk
For years I kept wanting to build one for ~$2500. I'm glad I never pulled the trigger on that, because these days I much prefer to pull on something that looks a bit more like this:

goat_v2.jpg
 

Carambrda

Platinum Contributor
ECF Refugee
Member For 5 Years
Unless like me you find out that Win 7 has no drivers for the video board you bought.

IMHO, anyone who does not have to return or order additional stuff when making their own PC has done it quite a bit. For a first-timer it is quite daunting.

I have built about 4 PCs from scratch over the years, and repaired and upgraded quite a few for myself and family and friends, etc.
And it is much more complicated now than it used to be. So many more variants.

For the vast majority of users: buy one already put together and use it.

You like to tinker and scratch your head wondering why your BIOS won't work with that other part, why all the screws for the motherboard won't fit right in the case you bought, or why there's not quite enough room for the water-cooled heat sink and your power supply, etc.

Yep DIY.
My biggest pet peeve was always that every decent motherboard had the 2nd PCIe x16 slot located too close to the 1st one, so that putting two double-size air-cooled video cards in SLI would cause the 2nd card's PCB to block too much of the airflow intake of the 1st one, and I didn't want to go for watercooling. Next, Asus came out with a workstation motherboard that moved these two slots farther apart, but the CPU socket on that one was located too close to the 1st PCIe x16 slot, so that the Noctua NH-D14 (a huge aftermarket air cooler) couldn't fit on the Intel CPU... that's how I always kept ending up ordering nothing, time after time after time, until finally I just kind of grew tired of it all. Around the same time I was also gradually losing interest in gaming, because practically all the latest games were as boring as watching paint dry, and IMO these days they still are.
 

Carambrda

Platinum Contributor
ECF Refugee
Member For 5 Years
You must mean that for that video card there are no W7-compatible drivers. Possible but doubtful, and I think you mean this as an example of what might happen during novice DIY. If specced correctly there are no issues with things working or not fitting.
Agree that the first time is daunting.
For anyone planning on using a PC for the rest of their life, I'd always recommend DIY, unless they just don't mind overpaying for inferiority or for some reason just aren't able to grasp the concepts. Anyone can get a PC specced for their use, learn the why behind those specs, and get guidance for assembly through boot-up, all for zero cost.
I started to just buy decently priced average laptops instead. A family member has the annoying habit of causing inadvertent power outages, so that's mainly why... it's like having an uninterruptible power supply (UPS) conveniently on the cheap.
 

The Cromwell

I am a BOT
VU Donator
Diamond Contributor
Member For 4 Years
I started to just buy decently priced average laptops instead. A family member has the annoying habit of causing inadvertent power outages, so that's mainly why... it's like having an uninterruptible power supply (UPS) conveniently on the cheap.
A decent UPS is a good investment. I have an old APC 300W for the computer desk and one for the TV/entertainment corner.
Had them for over 20 years and just replace the batteries every 3-5 years.
 

Vape Fan

_evil twin_
Staff member
Senior Moderator
VU Donator
Platinum Contributor
Press Corps
Member For 5 Years
I swapped an HDD for an SSD once, and that or adding memory is about all I'd do with a laptop.
I bought an infinity edge touch screen laptop when they first came out and have barely used it in the last 2 yrs.
I'm using a desktop I built close to 3 yrs ago. I overbuilt it, but I plan on doing nothing but routine maintenance for years to come unless something starts to fail.
 

Carambrda

Platinum Contributor
ECF Refugee
Member For 5 Years
I swapped an HDD for an SSD once, and that or adding memory is about all I'd do with a laptop.
I bought an infinity edge touch screen laptop when they first came out and have barely used it in the last 2 yrs.
I'm using a desktop I built close to 3 yrs ago. I overbuilt it, but I plan on doing nothing but routine maintenance for years to come unless something starts to fail.
Each time I buy a new laptop I just make sure the specs are good enough to get me through the next 3 years without requiring any upgrades, and that it has a 3-year warranty. Then I replace the whole laptop, usually around the time the warranty expires, when a solid bargain pops up. No touch screen for me, just a normal 15″ 1080p LCD to keep the price down, because I barely ever use the screen anyway... that's what the TV is for. One thing I like about laptops is that I can continue to surf the web by hooking one up to my smartphone via USB or WiFi after the power goes out again, and I can do so without having to stay near the TV, where there's simply no space to sit in front of a smaller screen with a UPS attached to both it and a desktop PC. Lugging a separate screen plus a UPS with a desktop PC attached to it to another room, finding space to put it there, and then lugging everything all the way back to the TV again an hour later is just too much hassle IMO, if it's even possible at all.
 

The Cromwell

I am a BOT
VU Donator
Diamond Contributor
Member For 4 Years
Each time I buy a new laptop I just make sure the specs are good enough to get me through the next 3 years without requiring any upgrades, and that it has a 3-year warranty. Then I replace the whole laptop, usually around the time the warranty expires, when a solid bargain pops up. No touch screen for me, just a normal 15″ 1080p LCD to keep the price down, because I barely ever use the screen anyway... that's what the TV is for. One thing I like about laptops is that I can continue to surf the web by hooking one up to my smartphone via USB or WiFi after the power goes out again, and I can do so without having to stay near the TV, where there's simply no space to sit in front of a smaller screen with a UPS attached to both it and a desktop PC. Lugging a separate screen plus a UPS with a desktop PC attached to it to another room, finding space to put it there, and then lugging everything all the way back to the TV again an hour later is just too much hassle IMO, if it's even possible at all.

But the UPS will also protect your laptop charger and such.
Also, good surge protectors. They actually do not have to be attached to the device being protected.
Your house is basically two 110V circuits in parallel, so a surge suppressor anyplace will help things all over.
It's actually easier if you are in the UK or someplace where it is 220, because it is just one circuit.
 

casketweaver

Bronze Contributor
Member For 4 Years
ECF Refugee
Considering that almost every GPU nowadays is PCI-E, the choice shouldn't be too difficult. That, and finding the PROPER PSU to POWER IT. Which I find people tend to skimp on (or they try to upgrade to a card that requires 350-500W of power with a 300-350W PSU), and when they do, they see tragic consequences. You wouldn't allow a 9V battery to try and start your car, so why would you even CONSIDER underpowering your PC or its components?
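A rough way to sanity-check PSU sizing along those lines is to add up the component draws and leave healthy headroom. A small Python sketch (the build, its wattages, and the 50% headroom figure are all illustrative assumptions, not measured numbers):

```python
def recommend_psu_watts(component_watts, headroom=0.5):
    """Sum component draw, add headroom, round up to the next 50W tier."""
    total = sum(component_watts.values()) * (1 + headroom)
    return int(-(-total // 50) * 50)  # ceiling to a multiple of 50

# Hypothetical build (ballpark draws for illustration only):
build = {"cpu": 125, "gpu": 250, "board_ram_fans": 75, "drives": 25}
print(recommend_psu_watts(build))  # -> 750
```

The headroom matters for exactly the failure mode described above: a PSU running near its limit sags under load spikes, which is when those "tragic consequences" show up.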

Picking the correct board and CPU is the pits though, due to socket type / CPU compatibility and chipset / RAM compatibility. For instance, when I priced my newest setup, OMEGA, I went Intel (as I usually do, due to the driver and software issues that have plagued AMD for years now - granted, they're getting better, but I'd rather not risk it), and I chose the best board for my CPU and memory speed possible. Fine print - read it. The socket type is the 2011, meaning 2011 contacts on the board and CPU. The board's chipset is compatible with most if not all current-day CPUs and currently capable of running the fastest frequency obtainable on current-day DDR4. Does that mean it won't work with higher frequencies or faster CPUs? No, but it may need a bit of intervention from the end user to do so - and why would you mess with it if the current firmware is already set up and ready to go?

As for Windows 7 driver incompatibility, I'm not sure what age of hardware you were running or why, but in all the years I've built systems, driver incompatibility has never been an issue under any circumstance (unless I was running a 3D GPU in a 2D environment - i.e. running an Nvidia Quadro under Windows 98 or lower, in which case it wasn't even feasible), with the exception of AMD drivers or some of Nvidia's BETA drivers.

Sent from my Pixel 3 XL using Tapatalk
 

casketweaver

Bronze Contributor
Member For 4 Years
ECF Refugee
My biggest pet peeve always was that every decent motherboard had the 2nd PCIe x16 slot located too close to the 1st one so that putting two doublesize aircooled vidcards in SLI would cause the 2nd vidcard's PCB to block too much of the airflow intake of the 1st one, and I didn't want to go for watercooling. Next, Asus came with a workstation motherboard that moved these two slots farther apart, but the CPU socket on that one was located too close to the 1st PCIe x16 slot so that the Noctua NH-D14 (huge size aftermarket aircooler) couldn't fit on the Intel CPU... that's how I always kept ending up ordering nothing, time after time after time until finally I just kind of grew tired of it all, and, around the same time also I was gradually losing interest in gaming because practically all the latest games were boring like watching paint dry, and IMO these days they still are.
That's to make you add a bridged water block on both GPUs. Custom water loops, ahhhh, I love them so much. However, what I don't love about them is when the pump fails. LOL! Then you have to order a new pump, and yeah... that can be a pain in the ass altogether. Plus, SLI / CrossFire isn't really that much of a performance increase. I have seen gains in the graphics field as high as 10-20% when running 2+ cards in SLI / CrossFire. It was a cool idea, but when you see how it performs and you begin seeing its downsides vs its benefits, you tend to not really mess with it.

I built a system (AKA SKULLTRAIL) once that had 4 GPUs in it; at the time the top-of-the-line card was an 8800 Ultra... I had nothing but problems from it. I tested both the Xeon CPUs as well as a single Core 2 Extreme Edition, and the framerates were just terrible. I gained nothing, because sharing bandwidth across all PCI-E lanes can sometimes be a bad thing. I tested SLI once again on my current system with dual GTX 690s and again with GTX 1080 Tis; the performance increase was minimal at best. It wasn't until I started tuning VRAM and GPU clock speeds that I began noticing any improvements. Sadly, the whole 'working in sync' idea sounds good on paper, but in practical application it doesn't work as one might assume. Currently content with running my single RTX 2080 Ti, and although it's not a huge leap from the 1080 Ti, it's still faster overall - and yes, I have that overclocked as well.

I could see running them separately for media reasons, but in SLI, not a chance.

Sent from my Pixel 3 XL using Tapatalk
 

Carambrda

Platinum Contributor
ECF Refugee
Member For 5 Years
But the UPS will also protect your laptop charger and such.
Also, good surge protectors. They actually do not have to be attached to the device being protected.
Your house is basically two 110V circuits in parallel, so a surge suppressor anyplace will help things all over.
It's actually easier if you are in the UK or someplace where it is 220, because it is just one circuit.
It's all 220 here in Belgium, but houses are a lot smaller than in the U.S., so I simply have no space to put a UPS anywhere outside my own room, at least not without getting physically attacked by family member, so I just let family member physically attack the electrics instead... it actually helps save on my medication. And besides, I have no doubt in my mind that family member would still figure out a way to fuck the UPS up, like semi on purpose.
 

casketweaver

Bronze Contributor
Member For 4 Years
ECF Refugee
I thought it was the freaks that come out at night, not the geeks. Sheesh LOL
Yeah, even some nerds like myself have to come out at night. I have to sleep during the day so...

I go to bed normally around 1400 and am back at it by 2045. You know how things go.

Sent from my Pixel 3 XL using Tapatalk
 

Carambrda

Platinum Contributor
ECF Refugee
Member For 5 Years
Considering that almost every GPU nowadays is PCI-E, the choice shouldn't be too difficult. That, and finding the PROPER PSU to POWER IT. Which I find people tend to skimp on (or they try to upgrade to a card that requires 350-500W of power with a 300-350W PSU), and when they do, they see tragic consequences. You wouldn't allow a 9V battery to try and start your car, so why would you even CONSIDER underpowering your PC or its components?

Picking the correct board and CPU is the pits though, due to socket type / CPU compatibility and chipset / RAM compatibility. For instance, when I priced my newest setup, OMEGA, I went Intel (as I usually do, due to the driver and software issues that have plagued AMD for years now - granted, they're getting better, but I'd rather not risk it), and I chose the best board for my CPU and memory speed possible. Fine print - read it. The socket type is the 2011, meaning 2011 contacts on the board and CPU. The board's chipset is compatible with most if not all current-day CPUs and currently capable of running the fastest frequency obtainable on current-day DDR4. Does that mean it won't work with higher frequencies or faster CPUs? No, but it may need a bit of intervention from the end user to do so - and why would you mess with it if the current firmware is already set up and ready to go?

As for Windows 7 driver incompatibility, I'm not sure what age of hardware you were running or why, but in all the years I've built systems, driver incompatibility has never been an issue under any circumstance (unless I was running a 3D GPU in a 2D environment - i.e. running an Nvidia Quadro under Windows 98 or lower, in which case it wasn't even feasible), with the exception of AMD drivers or some of Nvidia's BETA drivers.

Sent from my Pixel 3 XL using Tapatalk
I agree... apart from that 1st sentence. Almost every GPU nowadays is baked into a phone or gaming console. :D
 

casketweaver

Bronze Contributor
Member For 4 Years
ECF Refugee
I agree... apart from that 1st sentence. Almost every GPU nowadays is baked into a phone or gaming console. :D
Well, PC-wise, almost every GPU is a PCI-E card. LOL. Oh, the days of ISA, PCI, and AGP. AGP slots and their goofy-ass bus speeds: you had AGP, AGP 2X, AGP 4X, and AGP 8X... the slot type stayed the same (roughly), they just increased bus speeds on it. Then some crazy ass got crazy and said 'Hey guys, since PCI is kind of dated, let's make a revised version of it and call it PCI-E... and let's release 1800 versions of it.' So now we have PCI-E 1, 2, 16 and now we're seeing 16x2... Like... calm down, crazy ass, just call it PCI-E 32 or 64 or 128 or 256, etc. Stop confusing people and just stick to a damn socket / slot type. Fucking hell.
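For what it's worth, the PCI-E version number and the x-number are two separate axes: the generation sets the per-lane speed, and x1/x4/x16 is the lane count. A rough Python sketch using commonly cited approximate per-lane throughput figures (treat them as ballpark, not spec-exact):

```python
# Approximate usable throughput per lane in GB/s, after encoding overhead.
PCIE_GBPS_PER_LANE = {1: 0.25, 2: 0.5, 3: 0.985, 4: 1.969}

def pcie_bandwidth(gen, lanes):
    """Approximate one-direction bandwidth in GB/s for a PCIe link."""
    return PCIE_GBPS_PER_LANE[gen] * lanes

print(pcie_bandwidth(3, 16))  # a Gen3 x16 graphics card slot
print(pcie_bandwidth(3, 4))   # a typical Gen3 M.2 NVMe slot
```

So "PCI-E 3.0 x16" isn't a new slot type at all: the same physical x16 slot just doubles in per-lane speed each generation, which is why the naming reads like version soup.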

AMD is notorious for that shit... You have socket AM2, AM2+, AM3, AM3+, AM4, AM4+ etc... Just... Pick a God damn socket and stick to it!

And yes, I bust AMD's ass hard... I used to be a huge fan of AMD and ATI before I was ever a fan of Intel and Nvidia. And before that, I was a fan of Cyrix and 3dfx. I know, I kind of dated myself... but who cares.

Sent from my Pixel 3 XL using Tapatalk
 

Carambrda

Platinum Contributor
ECF Refugee
Member For 5 Years
That's to make you add a bridged water block on both GPUs. Custom water loops, ahhhh, I love them so much. However, what I don't love about them is when the pump fails. LOL! Then you have to order a new pump, and yeah... that can be a pain in the ass altogether. Plus, SLI / CrossFire isn't really that much of a performance increase. I have seen gains in the graphics field as high as 10-20% when running 2+ cards in SLI / CrossFire. It was a cool idea, but when you see how it performs and you begin seeing its downsides vs its benefits, you tend to not really mess with it.

I built a system once (the Skulltrail platform) that had 4 GPUs in it, back when the top-of-the-line card was the 8800 Ultra... I had nothing but problems from it. I tested both Xeon CPUs as well as a single Core 2 Extreme Edition, and the framerates were just terrible. I gained nothing, because sharing bandwidth across all the PCI-E lanes can sometimes be a bad thing. I tested SLI once again on my current system, with dual GTX 690s and again with GTX 1080 Tis, and the performance increase was minimal at best. It wasn't until I started tuning the VRAM and GPU clock speeds that I began noticing any improvements. Sadly, the whole 'working in sync' idea sounds good on paper but, in practical application, doesn't work as one might assume. I'm currently content running my single RTX 2080 Ti, and although it's not a huge leap from the 1080 Ti, it's still faster overall, and yes, I have that overclocked as well.

I could see running them separately for media reasons, but in SLI, not a chance.

Sent from my Pixel 3 XL using Tapatalk
I was talking more about the era right after the i7-2700K and the i7-4770K, around the time when 120Hz screens were the latest hype. Nvidia SLI was almost a strict necessity for that, as it is the minimum fps, not the average fps, that makes all the real-world difference in (hardcore) gaming... same reason I never had any interest whatsoever in AMD CrossFire; it was always synonymous with poor minimum fps, on top of all the usual AMD driver issues you mention. Custom water loops are great, but if the pump fails you not only have to order a new pump, you also risk losing one or more expensive GPUs (and maybe the CPU as well), or at least losing OC stability. Add to that the fact that custom waterblocks would almost never fit the next generation of cards, and the cost was already relatively high to begin with. Remember, $2500 in this particular part of Europe doesn't buy you the same heavy chunk of hardware that you guys can get everywhere in the U.S.

I know the RTX 2080 Ti is a spiffing GPU. It's just that I can't see myself gaming anymore these days... call it faded glory if you want.
 

casketweaver

Bronze Contributor
Member For 4 Years
ECF Refugee
I was talking more about the era right after the i7-2700K and the i7-4770K, around the time when 120Hz screens were the latest hype. Nvidia SLI was almost a strict necessity for that, as it is the minimum fps, not the average fps, that makes all the real-world difference in (hardcore) gaming... same reason I never had any interest whatsoever in AMD CrossFire; it was always synonymous with poor minimum fps, on top of all the usual AMD driver issues you mention. Custom water loops are great, but if the pump fails you not only have to order a new pump, you also risk losing one or more expensive GPUs (and maybe the CPU as well), or at least losing OC stability. Add to that the fact that custom waterblocks would almost never fit the next generation of cards, and the cost was already relatively high to begin with. Remember, $2500 in this particular part of Europe doesn't buy you the same heavy chunk of hardware that you guys can get everywhere in the U.S.

I know the RTX 2080 Ti is a spiffing GPU. It's just that I can't see myself gaming anymore these days... call it faded glory if you want.
Yeah. I think people overhyped SLI and monitor refresh rates, pushing people to invest in unnecessary hardware just for a minor gain here and there. Now that you mention it, average framerates at any setting on any hardware should be about 60 FPS with a 1ms response time. At that time, monitors, regardless of their advertised refresh rate (120Hz+), were, and most still are, unable to achieve 120Hz+ with a true 1ms response time.

Or maybe my brain is trailing off

Sent from my Pixel 3 XL using Tapatalk
 

Carambrda

Platinum Contributor
ECF Refugee
Member For 5 Years
Yeah. I think people overhyped SLI and monitor refresh rates, pushing people to invest in unnecessary hardware just for a minor gain here and there. Now that you mention it, average framerates at any setting on any hardware should be about 60 FPS with a 1ms response time. At that time, monitors, regardless of their advertised refresh rate (120Hz+), were, and most still are, unable to achieve 120Hz+ with a true 1ms response time.

Or maybe my brain is trailing off

Sent from my Pixel 3 XL using Tapatalk
Yes, of course it's all been largely overhyped. But the whole idea behind 120Hz (and 144Hz) is that, on a traditional 60Hz screen, even a single duped frame is still (very) noticeable in the motion fluidity, or smoothness... in a fast-paced first-person shooter it tends to be distracting, even disorienting at times, so not really overhyped in that regard. But the games themselves typically are overhyped like there's no tomorrow, at least IMO, and I'm not just talking about the flood of poorly ported console games, but also the piss-poor gameplay in dedicated PC games as well: the old rehash factor and boring DLC content, not to mention all the cheating and the pay-to-win mentality that's been dominating online multiplayer games... all bling and no true competition, and the controls suck anyway because they're convoluted; there are just too many different actions to assign to the mouse buttons and all across the whole keyboard.
 

casketweaver

Bronze Contributor
Member For 4 Years
ECF Refugee
Yes, of course it's all been largely overhyped. But the whole idea behind 120Hz (and 144Hz) is that, on a traditional 60Hz screen, even a single duped frame is still (very) noticeable in the motion fluidity, or smoothness... in a fast-paced first-person shooter it tends to be distracting, even disorienting at times, so not really overhyped in that regard. But the games themselves typically are overhyped like there's no tomorrow, at least IMO, and I'm not just talking about the flood of poorly ported console games, but also the piss-poor gameplay in dedicated PC games as well: the old rehash factor and boring DLC content, not to mention all the cheating and the pay-to-win mentality that's been dominating online multiplayer games... all bling and no true competition, and the controls suck anyway because they're convoluted; there are just too many different actions to assign to the mouse buttons and all across the whole keyboard.
I understand that, and to some, it may be noticeable. It does to some extent help even out frame rates, that I don't deny; however... there are still limits to how much it truly helps. Bottlenecks aside, not all GPUs, CPUs, RAM, or VRAM are created equal. Which is a huge reason why you should buy matched pairs... but wait, graphics cards don't come in matched pairs, so how can you... You can't. Hence why you can have 2 identical cards, from the same manufacturer, run at 2 totally different speeds.

Take my 1080 Tis, for instance: they both ran at similar frequencies and speeds, yet the one in slot A always ran 40-100MHz behind the one in slot B, and at first I thought maybe it was a PCI-E issue, so I switched them around... The same card still ran 40-100MHz slower. I mean, it still ran within its stated spec, but it was always behind. The manufacturing dates were on point, so what was it? I'm still unsure. Now, that card overclocked like a champ, whereas the other card, meh, not so much. Firmware was identical as well. Both ran the same firmware, compiled on the same date, and just to double-check my work, I reflashed both cards with the same firmware on the same day, to no avail. That's why I said earlier, sometimes multiple cards are bad deals. Not to mention the various types of SLI available. You have software-based and hardware-based. You can put the bridge on or leave the bridge off. But again, that's delving into something the average user doesn't want to mess with. On top of that, the average person doesn't want to mess with the frequencies, voltages, timings, etc. of their hardware. Most people want to take it out of the box, install it, and if it works, it works, leave it alone. Then you've got silicon junkies like me, who rip shit out of boxes, drop it in, go to the BIOS, and start fucking with vcores, vdroop, multipliers, and CAS, tRAS, tRCD, and tRP limits, to squeeze every last drop of performance out of the hardware without doing instant permanent damage to it. Which, 8/10 times, I'm successful at. There's that 1 or 2 times that I volt too high or too low... and bang! Dead shit.

But why someone would need multiple cards in the first place is beyond me. Again, if you're running a multimedia platform and just need a card to handle encoding / decoding, or if you're running a multi-monitor setup, that's all good; but running them in SLI, honestly, is a waste of time, money, and valuable case space. LOL.

Sent from my Pixel 3 XL using Tapatalk
 

Carambrda

Platinum Contributor
ECF Refugee
Member For 5 Years
Well, PC-wise, almost every GPU is a PCI-E card. LOL. Oh, the days of ISA, PCI, and AGP. AGP slots and their goofy ass bus speeds. You had AGP, AGP 2X, AGP 4X, and AGP 8X... The slot type stayed (roughly) the same; they just kept increasing the bus speed. Then some crazy ass got crazy and said 'Hey guys, since PCI is kind of dated, let's make a revised version of it and call it PCI-E... and let's release 1800 versions of it.' So now we have PCI-E 1.0, 2.0, and 3.0, plus lane widths like x1, x8, and x16, and now we're seeing x16 times two... Like... Calm down, crazy ass, just call it PCI-E 32 or 64 or 128 or 256, etc. Stop confusing people and just stick to a damn socket / slot type. Fucking hell.

AMD is notorious for that shit... You have socket AM2, AM2+, AM3, AM3+, AM4, etc... Just... Pick a God damn socket and stick to it!

And yes, I bust AMD's ass hard... I used to be a huge fan of AMD and ATI before I was ever a fan of Intel and Nvidia. And before that, I was a fan of Cyrix and 3dfx. I know, I kind of dated myself... but who cares.

Sent from my Pixel 3 XL using Tapatalk
Think you forgot to mention the VESA Local Bus and, especially, the Opti2 Local Bus. Opti2 was the bee's knees, so that's what I got.
 

casketweaver

Bronze Contributor
Member For 4 Years
ECF Refugee
Think you forgot to mention the VESA Local Bus and, especially, the Opti2 Local Bus. Opti2 was the bee's knees, so that's what I got.
Opti2 sounds familiar... but Lord knows anymore. VESA, yes... that I did forget. My bad.

Sent from my Pixel 3 XL using Tapatalk
 

Carambrda

Platinum Contributor
ECF Refugee
Member For 5 Years
I understand that, and to some, it may be noticeable. It does to some extent help even out frame rates, that I don't deny; however... there are still limits to how much it truly helps. Bottlenecks aside, not all GPUs, CPUs, RAM, or VRAM are created equal. Which is a huge reason why you should buy matched pairs... but wait, graphics cards don't come in matched pairs, so how can you... You can't. Hence why you can have 2 identical cards, from the same manufacturer, run at 2 totally different speeds.

Take my 1080 Tis, for instance: they both ran at similar frequencies and speeds, yet the one in slot A always ran 40-100MHz behind the one in slot B, and at first I thought maybe it was a PCI-E issue, so I switched them around... The same card still ran 40-100MHz slower. I mean, it still ran within its stated spec, but it was always behind. The manufacturing dates were on point, so what was it? I'm still unsure. Now, that card overclocked like a champ, whereas the other card, meh, not so much. Firmware was identical as well. Both ran the same firmware, compiled on the same date, and just to double-check my work, I reflashed both cards with the same firmware on the same day, to no avail. That's why I said earlier, sometimes multiple cards are bad deals. Not to mention the various types of SLI available. You have software-based and hardware-based. You can put the bridge on or leave the bridge off. But again, that's delving into something the average user doesn't want to mess with. On top of that, the average person doesn't want to mess with the frequencies, voltages, timings, etc. of their hardware. Most people want to take it out of the box, install it, and if it works, it works, leave it alone. Then you've got silicon junkies like me, who rip shit out of boxes, drop it in, go to the BIOS, and start fucking with vcores, vdroop, multipliers, and CAS, tRAS, tRCD, and tRP limits, to squeeze every last drop of performance out of the hardware without doing instant permanent damage to it. Which, 8/10 times, I'm successful at. There's that 1 or 2 times that I volt too high or too low... and bang! Dead shit.

But why someone would need multiple cards in the first place is beyond me. Again, if you're running a multimedia platform and just need a card to handle encoding / decoding, or if you're running a multi-monitor setup, that's all good; but running them in SLI, honestly, is a waste of time, money, and valuable case space. LOL.

Sent from my Pixel 3 XL using Tapatalk
No, I mean two Nvidia cards in (bridged) SLI mode were truly needed back in the day for continuous, sustained motion smoothness in most GPU-intensive games. The 120Hz was to make the fps drops (still unavoidable, except in the much less power-hungry games) appear a lot less judder-like, as the duped frames would remain visible on screen for only half the period of time compared to traditional 60Hz panels. But for a 120Hz panel to be worth the added cost, plus some sacrifices in other picture-quality attributes, the minimum fps needed to go up, because that is what actually reduces the fps drops in the first place, merely by definition; making them less noticeable on screen is only the final step in the strategy. Nowadays, though, a single RTX 2080 Ti will already be sufficient.
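To put rough numbers on the half-the-refresh-period point above (a quick sanity check, not from the original post):

```python
# Arithmetic behind the 60 Hz vs. 120 Hz duped-frame argument:
# a duplicated frame lingers on screen for one extra refresh period.
def extra_persistence_ms(refresh_hz: float) -> float:
    """Extra on-screen time (ms) added by one duplicated frame."""
    return 1000.0 / refresh_hz

print(round(extra_persistence_ms(60), 2))   # 16.67 ms of extra judder per duped frame
print(round(extra_persistence_ms(120), 2))  # 8.33 ms -- half as long, as noted above
```

Same dropped frame, half the visible hitch: that is the whole pitch for 120Hz panels in this argument.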
 
Last edited:

Vape Fan

_evil twin_
Staff member
Senior Moderator
VU Donator
Platinum Contributor
Press Corps
Member For 5 Years
AMD is notorious for that shit... You have socket AM2, AM2+, AM3, AM3+, AM4, AM4+ etc... Just... Pick a God damn socket and stick to it!
At least AMD back then was better at board/CPU backwards compatibility than Intel. You could upgrade to a newer generation of CPU w/o changing the board. Not so with Intel; they make you buy a new board every time.

AMD was the king up until Bulldozer/FX, which underperformed; then Intel took the lead in CPUs.
 

MyMagicMist

Diamond Contributor
ECF Refugee
Member For 5 Years
AMD was the king up until Bulldozer/FX, which underperformed; then Intel took the lead in CPUs.

Still do not trust Intel due to all that "v-chip" backdoor access hubbub. Will stick to AMD even if they have "backdoor" access, known or unknown. I know a bunch of "nobodies" access my computer when I install stuff using synaptic/dpkg/aptitude/apt-get. I note, though, that these are usually the programmers/developers via installation scripts. In that case they are what I'd call "Caspers", friendly ghosts who set up software.

I have also "hardened" my copy of Debian buster: I use programs called FAM (File Alteration Monitor), Tripwire, and rootkit hunter, and I set up anti-fork-bomb protection. I also refuse to use aliases, as the computer likes to play games with them. I don't want to type dir meaning ls -alf and have the operating system do rm -rf / instead, for example.
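For what it's worth, the alias pitfall and the fork-bomb cap described above can be sketched in bash roughly like this (the alias name and the limit value are illustrative examples, not the poster's actual config):

```shell
#!/usr/bin/env bash
# Illustrative sketch only -- names and values are examples, not the poster's setup.
shopt -s expand_aliases   # aliases are off by default in non-interactive shells

# The alias pitfall: 'dir' silently expands to whatever the alias table says,
# so you have to trust every alias definition your shell loads.
alias dir='ls -alF'
type dir                  # shows what 'dir' really expands to, before trusting it

# Crude anti-fork-bomb protection: cap how many processes this user may spawn.
# Per-session soft limit (errors suppressed in case the hard limit is lower):
ulimit -S -u 2048 2>/dev/null || true
# Persistent equivalent via /etc/security/limits.conf (illustrative username):
#   someuser  hard  nproc  2048
```

Checking an alias with `type` before relying on it is exactly the kind of defense against "the computer playing games with aliases" the post is describing.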
 

casketweaver

Bronze Contributor
Member For 4 Years
ECF Refugee
At least AMD back then was better at board/CPU backwards compatibility than Intel. You could upgrade to a newer generation of CPU w/o changing the board. Not so with Intel; they make you buy a new board every time.

AMD was the king up until Bulldozer/FX, which underperformed; then Intel took the lead in CPUs.

The key word there... Was. AMD is still the king right now as far as core count / die size goes. I just won't invest in them anymore due to past issues: drivers, thermals, and general instability, to name a few. Then again, I'm someone who's not too happy with Intel either. We really need a third contender in that arena.

Sent from my Pixel 3 XL using Tapatalk
 

gopher_byrd

Cranky Old Fart
VU Donator
Diamond Contributor
ECF Refugee
Member For 5 Years
VU Patreon
I'm an RF Field Engineer working on Smart Grid technology for utilities. Our company makes AMI (remote-read) electric, gas, and water meters, as well as the network gear those need to work. Part of that is Distribution Automation, for remotely controlling the utility's electric distribution network and power sensing. We also make remotely controlled street light controllers. My job is to find suitable poles for the network gear and then make everything work.
 
