Bitcoin Miners VS DIY PC Enthusiasts

Recently the temperature sensors on my graphics card died. A good way to tell is to look at the readings in your card's software (e.g. AMD Catalyst). If the temperature in the overclock section reads something like, say, -128 Celsius, that usually means the sensors are screwed. As a result, the fan now runs full blast regardless of activity, which means it will likely burn out sooner or later. I've tried to work around this with utilities such as MSI Afterburner, to no avail.
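If you want to automate that sanity check, here's a minimal sketch. It assumes a Linux box that exposes its sensors through hwmon (the readings above come from Catalyst/Afterburner, so treat this as illustrative only), and the plausibility window is just a guess:

    #!/usr/bin/env python3
    # Minimal sketch: flag temperature sensors whose readings are physically
    # implausible (e.g. the -128 C reading mentioned above).
    # Assumes Linux hwmon; the plausibility window below is an assumption.
    from pathlib import Path

    PLAUSIBLE_RANGE_C = (5.0, 110.0)  # assumed sane window for a working sensor

    def read_hwmon_temps():
        """Yield (chip name, temperature in Celsius) for every hwmon sensor."""
        for hwmon in Path("/sys/class/hwmon").glob("hwmon*"):
            name_file = hwmon / "name"
            chip = name_file.read_text().strip() if name_file.exists() else hwmon.name
            for temp_file in hwmon.glob("temp*_input"):
                yield chip, int(temp_file.read_text().strip()) / 1000.0  # millidegrees -> C

    if __name__ == "__main__":
        for chip, celsius in read_hwmon_temps():
            ok = PLAUSIBLE_RANGE_C[0] <= celsius <= PLAUSIBLE_RANGE_C[1]
            print(f"{chip:15s} {celsius:7.1f} C  {'OK' if ok else 'sensor looks dead'}")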

I've had to replace a lot of things in this old Dell XPS 8300: RAM, power supplies (I've gone through three in the last seven years), and even the graphics card itself, which was an upgrade over the crappy stock one that came with the pre-built tower.

Right now I'm doing a cost analysis on three options: replace the graphics card, build a new tower from scratch, or buy another pre-built PC.

The biggest trouble with replacing the graphics card is finding one the motherboard will actually work with (Dell doesn't allow upgrading to hardware they don't support. Assholes.). The tower itself is seven years old, and I recently had to replace my monitor because it was simply crapping out.

I'm also looking into building a new tower from scratch, but hardware prices are pretty insane right now, even for parts that aren't current gen. The reason for this? Bitcoin miners.

A classmate told me that miners have been buying hardware like crazy to build mining rigs, driving up prices. Searches on Newegg and Amazon confirm it.

If any other PC enthusiasts can offer advice, I'd appreciate it. Either way, I suspect a lot of money is about to be spent.


Comments

dervish's picture

I was under the impression that you'd have to have a source of nearly free power to even consider mining.


"Obama promised transparency, but Assange is the one who brought it."

arendt's picture

the prices of hardware are pretty insane, even for hardware that isn't current gen. The reason for this? Bitcoin miners.

You don't say why you need a serious GPU, as opposed to the on-chip GPUs that come in the latest Intel chips. I know you work in the industry, but do you have to provide your own PC? Or do you have a hobby or side venture that uses GPUs? Do you program GPUs or just run commercial software on them?

I agree that GPUs are very expensive right now, even though the professional miners have moved on to custom ASICs. Hopefully the drop in the price of bitcoin will push a lot of the amateur miners, who do use GPUs, into liquidating their hardware. That would drop the price of used GPUs.

If any other PC enthusiasts can offer advice, I'd appreciate it. Either way, I suspect a lot of money is about to be spent.

Since I'm not in the market for hardware, I'm afraid I can't give much advice. Let me just talk about how I see the requirements.

In my personal experience, non-work PCs are almost a commodity, because they don't need a lot of horsepower to do what a home user needs: email, streaming video, and office software (I use LibreOffice to avoid the M$ fuckups and unwanted "upgrades"). I'm currently running a 7-year-old Apple iMac. Right up until the latest release (Mojave) a couple of months ago, I was able to upgrade the OS, even though my DRAM is limited to 16 GB. I replaced the CD-ROM drive under an extended warranty, and I had to pay to replace the hard drive, which was backed up by Time Machine. Other than that, zero problems. No failed security updates, no malware attacks.

I took a look at a new iMac two years ago, and other than a 5K Retina display and a 3x faster CPU (well, maybe 3x as many cores, since the clock rate was about the same), it just didn't seem worth shelling out close to $2,000 for those few improvements. I'll ride this iMac into the ground, and when it goes, I'll buy a commodity machine and put Linux on it.

Apple has gotten just as greedy and stupid as M$ has been, except they used to deliver a quality product. Their closed ecosystem always pissed me off, because they don't allow you to do any repairs or upgrades.

That's all I've got. Sorry if it doesn't help.

Hawkfish's picture

@arendt

I recently had a work MacBook that I needed to upgrade for demos (I do research in high-speed desktop computation), and I was surprised to learn that my previous machine was seven years old. So was the IT department. No problems in all that time.

I’ve been using Macs for 35 years now, and this is the usual story. The only chronic problem I recall was an underspecced power thermistor in the Mac Plus. Fortunately some friends of mine were in the repair business and replaced it with a beefier one...


We can’t save the world by playing by the rules, because the rules have to be changed.
- Greta Thunberg

arendt's picture

@Hawkfish

Heat dissipation has always been an issue with Apple, for two reasons. First, they are always striving for the smallest package and the slickest design, and in such designs keeping the product cool is less important than looking cool. Second, Steve Jobs was a head case who found the sound of cooling fans annoying, so he banned them. It's the same kind of worthless obsession that forced the new spaceship HQ's doors and banisters to have zero visible fasteners.

My iMac has no fans. It relies on processor heat to generate convective air flow, and on the metal-walled case to pull the heat out where it can radiate. I had an earlier G4, the desktop machine in the plastic case that touted its computer-controlled fans (they realized that that turkey needed fans, but they still couldn't keep the box from cooking itself). The G4 fried itself in a little over a year. Apple wound up giving me a G5 for free when the G4 kept turning up for repair within the warranty period.

I will say that Apple does honor their warranties. But the whole "Genius Bar" shtick wears on me.

WindDancer13's picture

Other than my first computer (which I custom-ordered around 1985) and one other machine, I build my own. I have been keeping a close eye on prices since it's nearly time for a new build, and they are ridiculous. However, nothing beats creating something that will last and has exactly what you want.

The suggestion: look for sales on pre-built computers that have the motherboard, CPU, PSU size, and case that you want. There are a lot of really good bargains to be found this way. They will usually come with 4-8 GB of memory, a smallish hard drive (generally 500 GB), and integrated graphics, but you can add memory (rather than having to buy all of it separately), an additional hard drive (these are actually pretty cheap), and a good graphics card to get to what you need. That usually works out cheaper than selecting everything piecemeal (although that is how I build mine).
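To make that trade-off concrete, here is a back-of-the-envelope sketch. Every price in it is a made-up placeholder rather than a quote from any retailer, so plug in real numbers before deciding:

    # Rough comparison of the two approaches described above.
    # All prices are hypothetical placeholders -- substitute real quotes.

    prebuilt_base = 550            # sale-priced pre-built: mobo, CPU, PSU, case, 8 GB RAM, 500 GB HDD
    prebuilt_upgrades = {
        "extra RAM": 80,
        "second hard drive": 60,
        "graphics card": 250,
    }

    piecemeal_parts = {
        "case": 70,
        "motherboard": 120,
        "CPU": 200,
        "PSU": 80,
        "RAM (16 GB)": 150,
        "hard drive": 60,
        "graphics card": 250,
    }

    prebuilt_total = prebuilt_base + sum(prebuilt_upgrades.values())
    piecemeal_total = sum(piecemeal_parts.values())

    print(f"Pre-built plus upgrades: ${prebuilt_total}")
    print(f"Piecemeal build:         ${piecemeal_total}")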

I have seen some pretty decent builds for under $750 lately. Most definitely steer away from proprietary manufacturers like Dell. Also, there are quite a few sites that let you custom order a basic computer build. For the graphics card, check out manufacturer sites for refurbished cards.

My 5 cents worth.


We are what we repeatedly do. Excellence, then, is not an act, but a habit.--Aristotle
If there is no struggle there is no progress.--Frederick Douglass

arendt's picture

Micro Center has a lot of open-box and closeout stock, and the staff (at least in Cambridge, MA) knows its business.

https://www.microcenter.com/site/stores/default.aspx

Unfortunately for you, the closest one to Florida is north of Atlanta (Marietta or Duluth, GA). Google says the stores are about a 300-mile drive from Tallahassee, and you don't drive.

Outsourcing Is Treason's picture

You should be able to get a USB external graphics card for your rig that does what you want.


"Please clap." -- Jeb Bush

arendt's picture

@Outsourcing Is Treason

Those exist mainly because it's hard, if not impossible, to add hardware to a laptop. I think the OP is in the market for a desktop.

But the USB interface has nowhere near the bandwidth of a full-width internal (PCIe x16) slot. So, if your graphics app depends on constantly shuffling data to and from the GPU, it's going to be slow.

I know this because I wanted to add an external GPU to my laptop, but my investigation of the technology turned up what I just said above.
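The rough arithmetic behind that, using nominal link rates (theoretical peaks only; real-world throughput is lower on both sides, and most external GPU enclosures actually use Thunderbolt rather than plain USB):

    # Nominal payload bandwidth of the links involved, in bytes/s (after line
    # encoding overhead). These are theoretical peaks, not measured throughput.
    GB = 1e9

    links = {
        "USB 3.0 (5 Gb/s)":        0.5 * GB,
        "USB 3.1 Gen 2 (10 Gb/s)": 1.2 * GB,
        "PCIe 3.0 x16":            16 * 0.985 * GB,   # ~985 MB/s per lane
    }

    pcie = links["PCIe 3.0 x16"]
    for name, bw in links.items():
        note = "baseline" if bw == pcie else f"about {pcie / bw:.0f}x less than PCIe x16"
        print(f"{name:24s} {bw / GB:6.2f} GB/s  ({note})")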

arendt's picture

It's only an announcement (see the links below), with a vague shipping date of 2019. And 1 TFLOP is pretty lame compared to a state-of-the-art graphics card. OTOH, state-of-the-art graphics cards can cost upwards of $2,000.

So, which GPU to get really depends on what workloads you intend to run.

Intel Unveils 10nm Sunny Cove CPU Architecture With Gen11 Graphics, Major Performance Lift

Intel promises big boost to integrated GPU, breaks teraflop barrier

But don't take my posting this news as an endorsement of Intel. As a DIYer, you should keep in mind that AMD and Nvidia are way ahead of Intel in chip design.

Intel's integrated GPUs simply can’t compare to those on AMD chips or to discrete graphics from Nvidia. Intel has been getting creamed, and its own discrete GPU isn’t expected until 2020. Meanwhile, Intel publicly struggled to get its first 10nm processors, codenamed Ice Lake and based on its Skylake architecture, out the door. It’s repeatedly had to delay the chips making investors and consumers alike wary of the company’s future...

The first is improved integrated graphics. We know Intel has been investing heavily in rivaling AMD and Nvidia on the GPU front, and its new Gen11 integrated graphics won’t be the major step forward, but it will still be a big one. Current CPUs from Intel with integrated graphics are based on Gen9, which uses 24 enhanced execution units. Gen11 jumps that number to 64 and Intel claims it will double performance versus Gen9. The new integrated graphics will appear in 10nm chips next year

Intel Just Gave Us a Glimpse at the Near Future of CPUs
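For context on the teraflop figure and the 24 → 64 EU jump quoted above, the back-of-the-envelope math looks like this. It assumes the usual 16 FP32 operations per EU per clock for Intel's Gen architecture, and the clock speeds are placeholder guesses, since exact Gen11 clocks weren't public:

    # Rough peak-FLOPS estimate for Intel integrated graphics.
    # Each Gen-architecture EU has two SIMD-4 FP32 ALUs; counting a fused
    # multiply-add as two operations gives 16 FP32 FLOPS per EU per clock.
    FLOPS_PER_EU_PER_CLOCK = 16

    def peak_gflops(execution_units, clock_ghz):
        return execution_units * FLOPS_PER_EU_PER_CLOCK * clock_ghz

    # Clock speeds are assumed typical values, not official specs.
    print(f"Gen9  (24 EUs @ ~1.15 GHz): {peak_gflops(24, 1.15):6.0f} GFLOPS")
    print(f"Gen11 (64 EUs @ ~1.00 GHz): {peak_gflops(64, 1.00):6.0f} GFLOPS  # roughly 1 TFLOP")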
