How to do gradient transitions using SlimDX?

I am doing grayscale transitions between different images. I have an idea that I can do it with some blending, but can anybody please help me with how to actually implement it?
Thanks

This has been done language-agnostically here.
It seems it can be done in a pixel shader, but speed concerns require precomputing blending ratios.
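As a rough illustration of the blend itself (this is not SlimDX code), a greyscale cross-fade is just a per-pixel linear interpolation between the two images, with the blend factor t animated from 0 to 1 over the transition; in a pixel shader it collapses to one lerp per texel. A minimal CPU-side sketch in TypeScript, assuming both images are already greyscale byte buffers of the same size:

```typescript
// Cross-fade two greyscale images: t = 0 gives image A, t = 1 gives image B.
function crossFadeGray(a: Uint8Array, b: Uint8Array, t: number, out: Uint8Array): void {
  for (let i = 0; i < out.length; i++) {
    out[i] = Math.round((1 - t) * a[i] + t * b[i]); // lerp(a[i], b[i], t)
  }
}
```

Precomputing the blending ratios, as mentioned above, amounts to replacing the single t with a per-pixel table of ratios looked up inside the loop (or inside the shader).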

Related

How are SVGs able to incorporate effects like blur?

I have been familiar with the SVG format for a long time, as well as its usability and benefits over a raster image.
But recently I came to a situation where I needed a blur effect in SVG (basically an asset defined by primitive shapes that mimics a blur effect and is infinitely scalable), so I did a Google search and, much to my surprise, there are official ways of doing it; I was expecting there not to be!
I am basically intrigued: if SVGs are really made up of primitive shapes defined mathematically, then how can they incorporate an effect like blur? What shape can even be used for such a process?
In Firefox we render the SVG to an offscreen surface, blur the pixels on that surface and then blit the offscreen surface. I imagine other browsers work similarly.
Filters and masks are raster operations; almost everything else is vector.
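To make the raster nature of this concrete, here is a rough analogy of that render-blur-blit sequence using the 2D canvas API; it is not what an SVG engine literally calls, just the same idea in a sketch (blurShape and the circle are made up for illustration).

```typescript
function blurShape(main: CanvasRenderingContext2D, w: number, h: number): void {
  // 1. Rasterise the vector primitive (here a circle) to an offscreen surface.
  const off = document.createElement("canvas");
  off.width = w;
  off.height = h;
  const octx = off.getContext("2d")!;
  octx.fillStyle = "steelblue";
  octx.beginPath();
  octx.arc(w / 2, h / 2, Math.min(w, h) / 4, 0, Math.PI * 2);
  octx.fill();

  // 2 + 3. Blur the pixels and blit them onto the main surface;
  // the blur operates on pixels, never on a shape.
  main.filter = "blur(4px)";
  main.drawImage(off, 0, 0);
  main.filter = "none";
}
```

In SVG itself, the official mechanism is the filter element (for blur, feGaussianBlur), which browsers implement with exactly this kind of raster step.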

THREE.js: How to increase texture quality

What are the possible and good ways/best practices/etc to improve texture quality in THREE.js?
I have a scene with planes (cards) that use 512x512px textures. You can see how it looks in the images below. My problem is that the textures look blurred. I have tried changing the filters and the anisotropy value, and it helps, but only a little; the textures still look blurred. The only way I found to make the textures look the way I want is to double the render size while keeping the canvas size the same. That is a bad way because of the performance cost, but I haven't found another way to get good texture quality.
The best quality - render size x2
Normal quality - magFilter = minFilter = THREE.LinearMipMapLinearFilter /anisotropy = 16
Bad quality - no filters
I hope for any help; thanks in advance.
You can hardly do better than trilinear filtering with 16x anisotropic filtering (and not all hardware can achieve 16x anisotropy).
However, you say your textures are 512x512, while (if your snapshots are real size) it appears clear that:
They are rendered much smaller than 512x512. This means a lower mipmap level, generated by WebGL, is currently being used to render your cards.
Your cards are rectangular while your textures are square. Depending on how you mapped the texture onto your shape, this could mean the aspect ratio changes, so the sampler needs to do more interpolation (more filtering, meaning more blur).
So what you can try to do is the following (a minimal three.js setup along these lines is sketched after the list):
Use a smaller base texture, 256x256 for example, that you prepare yourself with the best sharpness you can, so that no minification filtering is needed when WebGL samples the texture.
Adapt the mesh texture coordinates to your texture, or vice versa, to avoid aspect-ratio changes during texture sampling.
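A minimal sketch of the filtering side in three.js, assuming a texture named cardTexture and a WebGLRenderer named renderer (both placeholder names): the first lines are the trilinear-plus-maximum-anisotropy baseline from above, and the commented alternative corresponds to a base texture already authored at roughly its on-screen size, so no minification is wanted.

```typescript
import * as THREE from "three";

function configureCardTexture(cardTexture: THREE.Texture, renderer: THREE.WebGLRenderer): void {
  // Baseline: trilinear filtering plus as much anisotropy as the hardware reports.
  cardTexture.minFilter = THREE.LinearMipMapLinearFilter;
  cardTexture.magFilter = THREE.LinearFilter;
  cardTexture.anisotropy = renderer.capabilities.getMaxAnisotropy();

  // Alternative, if the base texture already matches its on-screen size:
  // skip the mip chain entirely so WebGL never minifies it.
  // cardTexture.generateMipmaps = false;
  // cardTexture.minFilter = THREE.LinearFilter;

  cardTexture.needsUpdate = true;
}
```

On a high-DPI screen, renderer.setPixelRatio(window.devicePixelRatio) has much the same effect as the "render size x2" workaround, with a similar performance cost.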

Three.js - SSAO defect in transparent areas

I've been trying to add SSAO to my game, per this SSAO EXAMPLE.
Unfortunately, my transparent trees now show defects:
Please advise on how to fix it.
This is not a defect. During the depth pass, which you need to pass to the SSAO shader further on, transparency is not taken into account, so the planes that define your leaves are detected as planar geometry and get the corresponding outline.
Concerning a solution, I cannot really help you. What you can do is hide all the transparent stuff before rendering the depth pass, but then the AO pass gets multiplied over it, so you just trade one visual problem for another. To really solve this, I think you need an additional Three.MaskPass; see here:
Rendering multiple scenes, with only 1 using SSAO [Three.js]
Hope this helps.
You could pass the texture into the normal/depth fragment shader and discard any fragments with an alpha < 0.5.
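One way to express that alpha-test idea in three.js without hand-editing the shader is to give each transparent mesh a depth material that samples the same diffuse map. This is only a sketch, assuming the depth pre-pass honours per-mesh customDepthMaterial; leafTexture and leafMesh are placeholder names for the tree assets.

```typescript
import * as THREE from "three";

// Placeholder stand-ins for the real tree assets.
const leafTexture = new THREE.TextureLoader().load("leaves.png");
const leafMesh = new THREE.Mesh(
  new THREE.PlaneGeometry(1, 1),
  new THREE.MeshLambertMaterial({ map: leafTexture, transparent: true, alphaTest: 0.5 })
);

// Depth material that samples the same texture and discards texels with alpha < 0.5,
// so the fully transparent parts of the plane no longer write depth for the SSAO pass.
leafMesh.customDepthMaterial = new THREE.MeshDepthMaterial({
  depthPacking: THREE.RGBADepthPacking,
  map: leafTexture,
  alphaTest: 0.5,
});
```

If the SSAO example renders its depth pass with a single scene.overrideMaterial instead, that pass would need a small adjustment so the per-mesh depth material (or at least the alpha test) is actually applied.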

Drawing transparent sprites in a 3D world in OpenGL (LWJGL)

I need to draw transparent 2D sprites in a 3D world. I tried rendering a QUAD, texturing it (using slick_util) and rotating it to face the camera, but when there are many of them the transparency doesn't really work. The sprite closest to the camera will block the ones behind it if it's rendered before them.
I think it's because OpenGL only draws the object that is closest to the viewer, without checking the alpha value.
This could be fixed by sorting them from furthest away to closest, but I don't know how to do that, and wouldn't I have to use Math.sqrt to get the distance? (I've heard it's slow.)
I wonder if there's an easy way of getting transparency in 3D to work correctly (e.g. enabling something in OpenGL).
Disable depth testing and render transparent geometry back to front.
Or switch to additive blending and hope that looks OK.
Or use depth peeling.
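On the sorting concern from the question: the relative order only depends on distance, so comparing squared distances is enough and no Math.sqrt is needed. A small sketch in TypeScript (the Sprite shape and camera coordinates are placeholders; the same comparator works in Java/LWJGL before issuing the draw calls):

```typescript
// Minimal sprite record with a world-space position.
interface Sprite { x: number; y: number; z: number; }

// Sort farthest-first so nearer transparent sprites blend over the ones behind them.
function sortBackToFront(sprites: Sprite[], camX: number, camY: number, camZ: number): void {
  const dist2 = (s: Sprite): number => {
    const dx = s.x - camX, dy = s.y - camY, dz = s.z - camZ;
    return dx * dx + dy * dy + dz * dz; // squared distance, no sqrt required
  };
  sprites.sort((a, b) => dist2(b) - dist2(a));
}
```

A common variant is to keep depth testing on but disable depth writes (glDepthMask(false)) for the sorted transparent pass, so opaque geometry still occludes the sprites correctly.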

Generating fast color rectangles

I am designing a more powerful color picker for Qt and looking for some advice. How would one go about generating fast, real-time color rectangles such as the ones found in Photoshop (for HSB and RGB)? I was originally thinking of using QImage and scanLine() to calculate all the pixels individually, but this would probably be too slow.
I was thinking it would be better to write an OpenGL shader. As I recall, you can assign colors to vertices and it will interpolate between them for you. I just have no idea how this would be done in Qt or if this is even worth the effort.
I am using QGraphicsView to display the rectangle. Any advice would be appreciated.
OK, so looking into QGradient a bit more: could you not use multiple QGradients to create the effect you need?
For the last of the three examples, you could create a single gradient with multiple stops for the colours themselves, then overlay it with a QGradient from black (alpha 0) to black (alpha 255), with appropriate stops to get the gradient to come in at the right point.
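To make that concrete, here is the layered-gradient idea sketched with the HTML canvas API rather than Qt (paintPickerRect is a made-up helper, not QGradient code); in Qt the same two layers map naturally onto two QLinearGradients painted one after the other with QPainter.

```typescript
function paintPickerRect(ctx: CanvasRenderingContext2D, w: number, h: number): void {
  // Layer 1: horizontal hue ramp built from several colour stops.
  const hue = ctx.createLinearGradient(0, 0, w, 0);
  ["red", "yellow", "lime", "cyan", "blue", "magenta", "red"].forEach((c, i, all) =>
    hue.addColorStop(i / (all.length - 1), c)
  );
  ctx.fillStyle = hue;
  ctx.fillRect(0, 0, w, h);

  // Layer 2: overlay of black going from fully transparent at the top
  // to fully opaque at the bottom (the "alpha 0 to alpha 255" gradient).
  const shade = ctx.createLinearGradient(0, 0, 0, h);
  shade.addColorStop(0, "rgba(0, 0, 0, 0)");
  shade.addColorStop(1, "rgba(0, 0, 0, 1)");
  ctx.fillStyle = shade;
  ctx.fillRect(0, 0, w, h);
}
```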
