I am using ILNumerics. I just tested some simple examples from the ILNumerics website.
If I choose the GDI renderer in the properties panel of the ILPanel control, it works fine.
If I choose the OpenGL renderer, the scene is plotted incorrectly, flickers, or is drawn only partially or not at all.
I use VS2010 Pro and Win7 64-bit on a Dell XPS 17 with GeForce 550M graphics.
Fogcity runs without problems using OpenGL. Any ideas?
You are probably not rendering with the GeForce card but with an onboard graphics chip. Make sure the NVIDIA card is used; check the NVIDIA control panel.
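If you want to double-check from code which adapters Windows reports, here is a rough C# sketch using WMI (it needs a reference to System.Management.dll). Note that it only lists the adapters; which GPU the OpenGL context actually ends up on is decided by the NVIDIA control panel profile.

using System;
using System.Management;   // requires a reference to System.Management.dll

class ListVideoAdapters
{
    static void Main()
    {
        // Enumerate the video controllers Windows knows about.
        // On an Optimus laptop you should see both the Intel chip and the GeForce.
        using (var searcher = new ManagementObjectSearcher(
            "SELECT Name, DriverVersion FROM Win32_VideoController"))
        {
            foreach (ManagementObject gpu in searcher.Get())
                Console.WriteLine("{0} (driver {1})", gpu["Name"], gpu["DriverVersion"]);
        }
    }
}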
I am developing a test application with DirectX 11 and feature level 10.1.
Everything works as expected, but when I maximize the window containing my graphics, the time per frame increases drastically, from about 1 ms to 40 ms.
NVS 300 graphics card
Windows 7 32-bit
The application draws a few sine waves with Direct3D, in C# via SharpDX.
Windows Forms with a control and a SharpDX-initialized swap chain, programmed to resize the back buffer on the resize event (the problem would occur without that too, though); the handler is sketched below.
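For reference, the resize handling looks roughly like this (simplified; mDevice, mRenderView and mControl are fields of my form, mSwapChain is the one from the Present call further down, and R8G8B8A8_UNorm is simply the format I created the swap chain with):

using System;
using System.Windows.Forms;
using SharpDX.Direct3D11;
using SharpDX.DXGI;
using Device = SharpDX.Direct3D11.Device;

partial class GraphForm : Form
{
    Device mDevice;                 // created during initialization
    SwapChain mSwapChain;           // created during initialization
    RenderTargetView mRenderView;   // view on the swap chain's back buffer
    Control mControl;               // the control the swap chain renders into

    // Hooked up to mControl.Resize during initialization.
    void OnControlResize(object sender, EventArgs e)
    {
        // Release the old render target view, resize the back buffer to the
        // control's new client size, then recreate the view from it.
        mRenderView.Dispose();

        mSwapChain.ResizeBuffers(1,
            mControl.ClientSize.Width, mControl.ClientSize.Height,
            Format.R8G8B8A8_UNorm,      // same format the swap chain was created with
            SwapChainFlags.None);

        using (var backBuffer = Texture2D.FromSwapChain<Texture2D>(mSwapChain, 0))
            mRenderView = new RenderTargetView(mDevice, backBuffer);
    }
}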
I used a System.Diagnostics.Stopwatch to narrow the problem down to this line:
mSwapChain.Present(1, PresentFlags.None);
which suddenly takes far longer as soon as the window is maximized.
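The timing itself is nothing special, roughly:

var sw = System.Diagnostics.Stopwatch.StartNew();
mSwapChain.Present(1, PresentFlags.None);    // the call that suddenly gets expensive
sw.Stop();
System.Diagnostics.Debug.WriteLine("Present: {0:0.00} ms", sw.Elapsed.TotalMilliseconds);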
Any clues?
In my specific case, switching to the Windows Classic theme (with Aero disabled) solved the issue: frame performance got worse whenever the Windows Start button overlapped the resized window.
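If you don't want to rely on users switching the theme manually, desktop composition can also be turned off programmatically on Vista/7 through dwmapi.dll (the call has no effect from Windows 8 onwards); a rough P/Invoke sketch:

using System.Runtime.InteropServices;

static class DwmHelper
{
    const uint DWM_EC_DISABLECOMPOSITION = 0;
    const uint DWM_EC_ENABLECOMPOSITION = 1;

    [DllImport("dwmapi.dll")]
    static extern int DwmEnableComposition(uint uCompositionAction);

    // Disables Aero/DWM composition while this process is running;
    // Windows re-enables it automatically when the process exits.
    public static void DisableComposition()
    {
        DwmEnableComposition(DWM_EC_DISABLECOMPOSITION);
    }
}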
I'm developing a cross-platform photo retouching application based on Gtk-2 (but already able to support Gtk-3 with minor modifications).
In my program, the result of the image retouching is previewed in a scrollable area that is implemented through a Gtk::DrawingArea inserted into a Gtk::ScrolledWindow. The drawing itself is performed using Cairo.
Recently I had the chance to test the software on a MacBook Pro laptop with a Retina display, and I immediately realised that the preview image gets magnified by a factor of 2, like all the other GUI elements.
Is there a way to tell Cairo and the DrawingArea to use the native screen resolution, instead of applying a 2x magnification? Is this supported in recent Gtk-3 versions?
Thanks for your help.
I presume DWM holds the bitmap data of each rendered window on the GPU. Can I access this data? I want to use it as a texture in D3D (or preferably OpenGL). Screenshotting each window to RAM and copying it back to the GPU is too slow.
I've seen other posts like: obtaining full desktop screenshot from the GPU
so I'm doubtful, but maybe something has changed in the last 3 years.
Edit
So do all applications use Direct3D to draw all their components? Would, say, this Chrome browser's content, or File Explorer's, or anything else exist as an image on the graphics card, or are only borders and such rendered through Direct3D/Direct2D? I want to make sure before pursuing this. By the way, my idea is a desktop for the Rift without running an alternate shell.
I am new to the libGDX 3D API. When I tried to run the animation, it showed a lot of randomly placed triangles on my screen. I followed this one: libGDX: 3d animation not working. When I play the animation in Blender it is fine, and I exported it with the default settings. I have included all the .obj, .mtl and .fbx files and all the textures, and I have read all the tutorials and comments by Mr. Awesome Xoppa, but still no result. Help will be much appreciated.
I have tried it on Windows 7 (OpenGL 2.0, Intel GMA X3100) with nightly builds of libGDX; even the simple knight animation in the gdx-tests doesn't work, although static meshes work fine. Today I tried it on Linux (Ubuntu 12.04 LTS, Mesa3D OpenGL 2.1) and it works, but there are some dark lines surrounding the animation. I think my archaic hardware and software support is what causes this.
The answer is that there is no way to run the animation on my Windows 7 machine, as the Intel driver there only supports OpenGL 2.0, but on Ubuntu it works because Mesa3D supports OpenGL 2.1. The black lines are vertices that are not assigned to any bone.
Until Windows Vista, ATI and nVidia supported a feature called horizontal span, which combined two monitors into a single larger screen.
This feature allows the taskbar to span both monitors, lets games run fullscreen across both monitors, and lets Remote Desktop Connection span both monitors without the /span option (I have four monitors, so my total screen width exceeds RDP's limit of 4096 pixels, which makes /span very annoying; also, the ActiveX control doesn't support it).
The Vista drivers from these companies do not support this feature, and it appears that they never will.
What changes were introduced by the WDDM that made this feature impossible?
Horizontal span can be implemented in Vista.
But not with nVidia cards, for now.
Microsoft changed the display driver architecture in Vista (WDDM), and nVidia (and the other vendors) would have to rework their earlier drivers to meet its requirements in order to provide horizontal span.
Unfortunately, nVidia hasn't done so.
I am using Matrox graphics cards and I have horizontal span and the other features in Vista; I'm enjoying WoW in Vista.
You and other nVidia users should ask nVidia to stop being lazy and start upgrading their drivers!