Can I completely disable a PCI slot in Linux? [closed] - linux-kernel

Closed. This question is off-topic. It is not currently accepting answers.
Closed 9 years ago.
Like many of you, I like to play a game every once in a while. Since I use Linux for my daily tasks (programming, writing papers, browsing, etc.), the only time I really use my graphics card's capabilities is while gaming in Windows.
Lately I noticed my energy bills have gotten really high, and I would like to reduce my computer's energy consumption. My graphics card draws 110 W at idle, whereas a low-end Radeon HD5xxx draws only 5 W. I estimate my computer is powered on 40 hours a week, of which only 3 are spent gaming. That means I waste about 202 kWh a year (!).
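(For reference, the arithmetic: assuming the high-end card sits idle for the remaining 37 of those 40 weekly hours, swapping to the 5 W card saves (110 W - 5 W) x 37 h/week x 52 weeks, which is roughly 202 kWh per year.)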
I figured I could just buy a DVI splitter and a low-end Radeon card, and disable the PCI slot of the high-end card in Linux. I Googled a bit, but I'm not sure which search terms to use, so I haven't found anything useful.
Too long, didn't read: Is it possible to cut off the power to a PCI slot using Linux?

No.
What you're asking isn't even a "Linux" question, but a motherboard question: is it electrically possible to do this?
The answer is still no.
The only chance you would have is to get the spec of the chip/card in the slot and see whether there is a bit you can set that would "disable" it, or put it into some low-power mode.
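For what it's worth, the Linux PCI core does expose per-device power hooks through sysfs, which is about as close as software gets to this. A minimal sketch, assuming a kernel with PCI runtime PM; the address 0000:01:00.0 is a placeholder for the card's actual BDF, and how far the device actually powers down depends entirely on the hardware and driver:

    # Sketch: ask the Linux PCI core to runtime-suspend a device, or to
    # drop it from the kernel's view entirely. Requires root; the BDF
    # "0000:01:00.0" is a placeholder - find yours with `lspci -D`.
    from pathlib import Path

    BDF = "0000:01:00.0"  # placeholder PCI address of the graphics card
    dev = Path("/sys/bus/pci/devices") / BDF

    # Let the kernel runtime-suspend the device whenever it is unused.
    (dev / "power" / "control").write_text("auto")

    # More drastic: remove the device so no driver touches it again
    # until a rescan. This still does not cut power to the slot.
    # (dev / "remove").write_text("1")

Note that this only asks the device to enter a low-power state; it cannot physically cut slot power, which matches the answer above.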

Related

Effect on performance of lowering resolution in the settings vs performance of real physical laptop already having that resolution [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 7 years ago.
I'm sorry that the title is confusing, but the question is so complicated that I couldn't make a good title. I don't know whether it's on the correct Stack Exchange site in the first place; if it's in the wrong forum, please migrate it!
HD = 1920x1080, UHD = 3840x2160
Let's say I buy a laptop with a UHD screen and, for example, an NVIDIA GeForce 980M card. Some games may run slowly because the resolution is very high.
Now I change the graphics settings to a lower resolution, from UHD down to HD. Will rendering at HD make the laptop perform equally to another laptop that is physically built with an HD screen, or will it still run a bit slower?
It's basically impossible to give an exact answer to this without measuring the exact computers, but either way, you are not going to notice any difference.
If they cost the same, the difference would come from the laptop manufacturer spending its hardware budget on a higher-resolution screen rather than on a better processor, etc.
If the software and hardware are the same except for the screen, you wouldn't notice any difference other than the price of the computer.
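For scale: UHD is 3840 x 2160 = 8,294,400 pixels, exactly four times HD's 1920 x 1080 = 2,073,600. A GPU-bound game rendering at HD therefore pushes a quarter of the pixels either way; the UHD laptop merely scales the finished HD frame up to fill its panel, which is comparatively cheap.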

Human computer interaction - waiting in line [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Closed 11 years ago.
As part of my assignment, I am supposed to find out what users would like to do while they're waiting in line (for anything).
I would appreciate it if you guys could provide your input.
Any time you give users something to do while your process is working, you risk massively alienating all of them. In my experience, users hate waiting entertainment, especially if it creates the feeling that it might extend the actual wait.
The very best thing to do in this situation, IMO, is to tell users exactly how much longer they will have to wait, so they can fill the time productively on their own.
They may get out their cellphones or smartphones to play games, browse the mobile internet, watch mobile TV, send texts, or call their friends. Others bring newspapers or books and read while they are in line. Some might be daydreaming, some keep busy listening to music, and those with companions will chat with them.
Part of your assignment in what? What does this have to do with programming? Do you intend to program the people in line? If so, I recommend the Clockwork Orange approach, complete with the 9th.

How to make Windows freeze for a short period of time [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 5 years ago.
Can I make Windows 7 freeze for a short period of time? During this time I want it to perform absolutely no action; in particular, the OS should not access any storage device.
My original idea was to create an interrupt; however, I think a long interrupt will cause a blue screen (which I want to prevent).
I can tell you for a fact that this is not possible: if you do this to the primary volume, even for a short time, the machine will bluescreen because of a watchdog timer. If you disconnect the drive, it will immediately bluescreen. If you do this to a secondary volume, the volume will be surprise-removed.
Unless you have a team of 10-20 very skilled NT kernel developers and testers, this idea is not even beginning to be practical. What is your scenario, and what are you trying to accomplish at a high level?

Proximity sensor recommendation to detect hand & blood [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Closed 11 years ago.
My company is interested in designing a protection system for small industrial machines into which factory employees insert wooden objects that the machines then cut.
The protection system needs to detect the presence of a human arm inserted into the intake opening: the worker's arm pushes in a piece of wood, and the machine cuts the wood. The system also needs to detect the presence of blood in case the arm is somehow cut, and in that case shut off the machine's power.
I'm no expert in sensing technologies, and as we are looking to hire one, I am asking for advice on a sensing technology that fits these requirements.
Capacitive sensing, as I understand it, can detect not only the presence of an object but also its type (e.g., distinguish a human arm from blood). Can such technology be used for the purpose mentioned above?
Thanks,
Arkadi
I wonder if something like this could help: http://www.sawstop.com/ I'm a woodworker as well and have been considering this device. I am not sure whether it can distinguish an arm from blood, but it seems able to sense 'flesh' and shut down.

Origin of "embarrassingly parallel" phrase [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Closed 11 years ago.
For the purposes of history on Wikipedia, is anyone familiar with the origin of the phrase "embarrassingly parallel"? I've always thought it may have been coined by a random Google employee who first worked on MapReduce. Does anyone have any concrete info on the origin?
The first use I could find in an advanced Google Books search was in an IEEE Computer Society digest published in 1978. The context, and the fact that the author put "embarrassingly" in quotes, suggests to me that the phrase was not coined there but had been used before.
It's decades old, but I first heard it used in reference to graphics rendering. Imagine you're rendering an animated movie: each frame is 2000x1000 pixels, there are 24 frames per second, 60 seconds in a minute, and 100 minutes in the movie. That's 2000 x 1000 x 24 x 60 x 100 = 288 billion pixels, every one of which can be computed in parallel. That's so parallel that it's embarrassing to compute it serially.
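To make the pattern concrete, here is a minimal sketch of an embarrassingly parallel workload in Python; render_frame is a hypothetical stand-in for the real per-frame work, and the point is simply that no task ever needs another task's output:

    # Minimal sketch: frames are mutually independent, so they can be
    # farmed out to worker processes with no coordination at all.
    from multiprocessing import Pool

    WIDTH, HEIGHT = 2000, 1000   # pixels per frame, as in the example above
    FRAMES = 24 * 60 * 100       # 24 fps * 60 s * 100 min = 144,000 frames

    def render_frame(frame_index: int) -> int:
        """Hypothetical per-frame work; depends only on its own input."""
        # A real renderer would fill WIDTH * HEIGHT pixels here; we just
        # return a dummy checksum so the parallel structure is visible.
        return frame_index % 256

    if __name__ == "__main__":
        with Pool() as pool:
            # imap_unordered: order doesn't matter, nothing is shared.
            for _ in pool.imap_unordered(render_frame, range(FRAMES), chunksize=512):
                pass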
Try this search: http://www.google.co.nz/search?rlz=1C1GGLS_enNZ364NZ365&sourceid=chrome&ie=UTF-8&q=etymology+embarrassingly+parallel
