I have been searching for how to colorize an image (in SVG or PNG format ...).
I tried covering my image with a rectangle that fills it, but since my image is not rectangular, the whole rectangle gets colorized rather than just the image.
Is it possible to change an image's color with QML? Alternatively, is it possible to change the color in Qt (with C++) using QPixmap and then integrate the QPixmap into a QML Item?
Thank you for your help. (If there is no solution, I will have to load a separate copy of the same basic image for each color.)
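For the C++/QPixmap route mentioned above, a minimal sketch (the tintPixmap helper is illustrative; exposing the result to QML, for example through a QQuickImageProvider, is not shown) is to repaint only the pixmap's opaque pixels with QPainter::CompositionMode_SourceIn:

#include <QColor>
#include <QPainter>
#include <QPixmap>

// Illustrative sketch: tint the opaque pixels of a pixmap while keeping its alpha.
QPixmap tintPixmap(const QPixmap &source, const QColor &color)
{
    QPixmap result(source);
    QPainter painter(&result);
    painter.setCompositionMode(QPainter::CompositionMode_SourceIn);
    painter.fillRect(result.rect(), color);
    painter.end();
    return result; // hand this to QML, e.g. via a QQuickImageProvider (not shown)
}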
In Qt 5 (from 5.2) you may use ColorOverlay as follows:
import QtQuick 2.0
import QtGraphicalEffects 1.0

Item {
    width: 300
    height: 300

    Image {
        id: bug
        source: "images/butterfly.png"
        sourceSize: Qt.size(parent.width, parent.height)
        smooth: true
        visible: false
    }

    ColorOverlay {
        anchors.fill: bug
        source: bug
        color: "#ff0000"  // makes the image look as if it lies under red glass
    }
}
In Qt 6 the QtGraphicalEffects module (which includes ColorOverlay) was removed because of licensing issues.
As of Qt 6.1 the graphical effects are available again, as #SCP3008 commented below.
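A minimal Qt 6 sketch of the same effect, assuming the Qt 5 Compat module is installed (qmake: QT += core5compat; the QML module is Qt5Compat.GraphicalEffects), would be:

import QtQuick
import Qt5Compat.GraphicalEffects

Item {
    width: 300
    height: 300

    Image {
        id: bug
        source: "images/butterfly.png"
        sourceSize: Qt.size(parent.width, parent.height)
        visible: false
    }

    // same ColorOverlay as above, only the import changes in Qt 6
    ColorOverlay {
        anchors.fill: bug
        source: bug
        color: "#ff0000"
    }
}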
You can replace your image color using ShaderEffect.
ShaderEffect {
    property variant src: yourImage
    property real r: yourColor.r * yourColor.a
    property real g: yourColor.g * yourColor.a
    property real b: yourColor.b * yourColor.a
    width: yourImage.width
    height: yourImage.height
    vertexShader: "
        uniform highp mat4 qt_Matrix;
        attribute highp vec4 qt_Vertex;
        attribute highp vec2 qt_MultiTexCoord0;
        varying highp vec2 coord;
        void main() {
            coord = qt_MultiTexCoord0;
            gl_Position = qt_Matrix * qt_Vertex;
        }
    "
    fragmentShader: "
        varying highp vec2 coord;
        uniform sampler2D src;
        uniform lowp float r;
        uniform lowp float g;
        uniform lowp float b;
        void main() {
            lowp vec4 clr = texture2D(src, coord);
            lowp float avg = (clr.r + clr.g + clr.b) / 3.;
            gl_FragColor = vec4(r * avg, g * avg, b * avg, clr.a);
        }
    "
}
The above code converts your image to grayscale and then applies your color.
You can also use Colorize.
The difference from ColorOverlay is:
Colorize really changes the color of an image using HSL, which was more suitable for me.
ColorOverlay is more like what happens when colored glass is laid over a grayscale image (it works on the RGBA values).
import QtQuick 2.12
import QtGraphicalEffects 1.12

Item {
    width: 300
    height: 300

    Image {
        id: bug
        source: "images/bug.jpg"
        sourceSize: Qt.size(parent.width, parent.height)
        smooth: true
        visible: false
    }

    Colorize {
        anchors.fill: bug
        source: bug
        hue: 0.0
        saturation: 0.5
        lightness: -0.2
    }
}
I also went looking for a non-QtGraphicalEffects solution and found one that uses the icon property, as follows:
icon.source - for setting the SVG image
icon.color - for changing the color of the SVG image
icon.width - for changing the width of the SVG image
icon.height - for changing the height of the SVG image
Some controls have the icon property such as Button, ItemDelegate and MenuItem.
In the following code, we implement AppIcon as a customized Button so that the only thing that remains is the icon rendering. We remove the button styling and sizing, so it is clipped to render just the icon itself. Because we have customized the Button, we can expose the clicked signal and the pressed property.
import QtQuick
import QtQuick.Controls
import QtQuick.Layouts

Page {
    ColumnLayout {
        anchors.centerIn: parent
        AppIcon {
            Layout.alignment: Qt.AlignHCenter
            icon.source: "biking-32.svg"
            icon.color: pressed ? "orange" : "red"
            onClicked: status.text = "red clicked"
        }
        AppIcon {
            Layout.alignment: Qt.AlignHCenter
            Layout.preferredWidth: 64
            Layout.preferredHeight: 64
            icon.source: "biking-32.svg"
            icon.color: pressed ? "orange" : "blue"
            onClicked: status.text = "blue clicked"
        }
        AppIcon {
            Layout.alignment: Qt.AlignHCenter
            Layout.preferredWidth: 96
            Layout.preferredHeight: 96
            icon.source: "biking-32.svg"
            icon.color: pressed ? "orange" : "green"
            onClicked: status.text = "green clicked"
        }
        Text {
            id: status
            text: "click on an image"
        }
    }
}
//AppIcon.qml
import QtQuick
import QtQuick.Controls

Item {
    id: appIcon
    implicitWidth: 32
    implicitHeight: 32
    property alias icon: btn.icon
    property alias pressed: btn.pressed
    signal clicked()

    Button {
        id: btn
        anchors.centerIn: parent
        background: Item { }
        icon.width: parent.width
        icon.height: parent.height
        onClicked: appIcon.clicked()
    }
}
//biking-32.svg
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 32 32"><path d="M16 5.5A2.5 2.5 0 1 1 18.5 8 2.498 2.498 0 0 1 16 5.5zm-1.042 19.02a1.545 1.545 0 0 0 1.021 1.688c.73.104 1.23-.25 1.452-1.211.034-.145 1.173-6.518 1.173-6.518l-3.979-2.146 2.823-4.087L19.498 15H24a1 1 0 0 0 0-2h-3.498l-2.434-3.27a1.63 1.63 0 0 0-.48-.551.995.995 0 0 0-.11-.083l-1.107-.765a1.685 1.685 0 0 0-2.194.595l-4.09 6.534a1 1 0 0 0 .367 1.408l5.114 2.8zM29.8 24.5a5.3 5.3 0 1 1-5.3-5.3 5.3 5.3 0 0 1 5.3 5.3zm-1.8 0a3.5 3.5 0 1 0-3.5 3.5 3.504 3.504 0 0 0 3.5-3.5zm-15.2 0a5.3 5.3 0 1 1-5.3-5.3 5.3 5.3 0 0 1 5.3 5.3zm-1.8 0A3.5 3.5 0 1 0 7.5 28a3.504 3.504 0 0 0 3.5-3.5z"/><path fill="none" d="M0 0h32v32H0z"/></svg>
You can Try it Online!
Related
I am trying to build an image slider using three.js and am having difficulties wrapping my head around passing the appropriate state to the GLSL shaders so I can transition between the slides. I can easily do it between two targets (be it textures or models) by simply easing between 0 and 1 and passing it as an attribute float, like this:
attribute float mix;
vec4 color = mix(tex1, tex2, mix);
But I can't understand how to approach it with more than 2 targets. Should I pass a number and do a bunch of if statements?
I set up my buffer plane geometry and my shader material, which contains my 3 textures, like this:
const uniforms = {
  time: { value: 0 },
  tex1: { type: 't', value: null },
  tex2: { type: 't', value: null },
  tex3: { type: 't', value: null },
  activeTexture: { type: 'i', value: 0 },
  mixFactor: { value: 0 }
}

const vertexShader = document.querySelector('#vertex-shader').text
const fragmentShader = document.querySelector('#fragment-shader').text

const geometry = new THREE.PlaneBufferGeometry(80, 40, 20, 20)
const material = new THREE.ShaderMaterial({
  uniforms,
  vertexShader,
  fragmentShader
})
// textures are loaded here...

// transition using GSAP
function shift () {
  let ease = Power3.easeInOut
  if (counter === 0) {
    TweenMax.to(uniforms.mixFactor, 2, { value: 1, ease, onStart () {
      uniforms.activeTexture.value = 1
    } })
  } else if (counter === 1) {
    TweenMax.to(uniforms.mixFactor, 2, { value: 1, ease, onComplete () {
      uniforms.activeTexture.value = 2
    } })
  } else if (counter === 2) {
    TweenMax.to(uniforms.mixFactor, 2, { value: 2, ease, onComplete () {
      uniforms.activeTexture.value = 0
    } })
  }
  console.log(uniforms.activeTexture.value)
  counter += 1
  if (counter === 3) counter = 0
}
// glsl
// morph between different targets depending on the passed int attribute
void main () {
    vec4 texColor = vec4(0.0);
    if (activeTexture == 0) {
        texColor = transition(tex1, tex2, vUv, mixFactor);
    } else if (activeTexture == 1) {
        texColor = transition(tex2, tex3, vUv, mixFactor);
    } else if (activeTexture == 2) {
        texColor = transition(tex3, tex1, vUv, mixFactor);
    }
    gl_FragColor = texColor;
}
This doesn't give me the desired effect (the textures switch abruptly between one another instead of transitioning into place, and it's a bit ugly). I am new to three.js and clueless about how to even approach the problem. How does one do this?
I brought my five kopecks :)
For example, say we want a transition across several pics. We can use arrays in our uniforms.
Here we go:
var uniforms = {
  textures: {
    value: []
  },
  transition: {
    value: 0
  }
};

var textureLoader = new THREE.TextureLoader();
textureLoader.setCrossOrigin("");

var pics = [
  "https://threejs.org/examples/textures/UV_Grid_Sm.jpg",
  "https://threejs.org/examples/textures/colors.png",
  "https://threejs.org/examples/textures/planets/moon_1024.jpg",
  "https://threejs.org/examples/textures/decal/decal-normal.jpg"
];

pics.forEach((p, idx) => {
  textureLoader.load(p, function (tex) {
    uniforms.textures.value[idx] = tex;
    tex.needsUpdate = true;
  })
});
Our geometry and vertex shader are the usual ones:
var planeGeom = new THREE.PlaneBufferGeometry(10, 10);

var vertShader = `
  varying vec2 vUv;
  void main()
  {
    vUv = uv;
    vec4 mvPosition = modelViewMatrix * vec4(position, 1.0);
    gl_Position = projectionMatrix * mvPosition;
  }
`;
The magic happens in our fragment shader, which is built dynamically based on the length of our array of links to pics:
var fragShader = `
  uniform sampler2D textures[` + pics.length + `];
  uniform float transition;
  varying vec2 vUv;

  vec4 getTexture(int index){
    for(int i = 0; i < ` + pics.length + `; i++){
      if (i == index){ return texture2D(textures[i], vUv); }
    }
  }

  void main()
  {
    if (transition == 1.){
      gl_FragColor = texture2D(textures[` + (pics.length - 1) + `], vUv); // show last
    }
    else {
      float chunk = 1. / ` + (pics.length - 1) + `.; // amount of transitions = amount of pics - 1
      float t = floor(transition / chunk);
      int idx0 = int(t);
      int idx1 = int(t) + 1;
      gl_FragColor = mix(
        getTexture(idx0),
        getTexture(idx1),
        (transition - (float(t) * chunk)) * ` + (pics.length - 1) + `.
      );
    }
  }
`;
The solution is flexible enough that you can have as many transitions as you want.
jsfiddle example r86
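To run the effect you only have to animate the transition uniform from 0 to 1. A minimal sketch of a driver (assuming the usual renderer, scene and camera objects from a standard three.js setup) could be:

// Illustrative render loop: sweep the transition uniform from 0 to 1 over 10 seconds.
var clock = new THREE.Clock();
function render() {
  requestAnimationFrame(render);
  uniforms.transition.value = Math.min(clock.getElapsedTime() / 10, 1);
  renderer.render(scene, camera);
}
render();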
I would do the mix in GLSL and the rest outside the shaders, managing what gets drawn. You can have one shader that takes 2 or more textures and transitions between them, but once the factor gets to 0 or 1, swap the textures out for other ones. If you need just three though... this is overkill.
Something along the lines of this:
const myTransitionMaterial = new THREE.ShaderMaterial({
  uniforms: {
    uLerp: { value: 0 },
    uTexA: { value: null },
    uTexB: { value: null },
  },
  vertexShader: vs,
  fragmentShader: fs,
})

//let's say you have a list of a bunch of textures, and you add them
myTransitionMaterial.textures = [tex1, tex2, tex3]

//and you want to lerp through them linearly using 0-1 regardless of how many there are
//(note: a regular function is needed here so that .bind() can set `this`)
myTransitionMaterial.lerp = function (normalizedFactor) {
  const length = myTransitionMaterial.textures.length
  const index = normalizedFactor * length // 0-3
  //at 0.00 we want 0-1 indices and 0.00 f
  //at 0.99 we want 0-1 indices and 0.99 f
  //at 1.00 we want 1-2 indices and 0.00 f
  //at 1.99 we want 1-2 indices and 0.99 f
  //at 2.00 we want 2-3 indices and 0.00 f
  //at 2.99 we want 2-3 indices and 0.99 f
  //at 3.00 we want 3-4 indices and 0.00 f
  const f = index - Math.floor(index)
  const i0 = Math.floor(index)
  const i1 = i0 <= length ? i0 + 1 : null //catch edge
  this.uniforms.uLerp.value = f
  this.uniforms.uTexA.value = this.textures[i0]
  this.uniforms.uTexB.value = this.textures[i1]
}.bind(myTransitionMaterial)
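A hypothetical usage example, driving lerp() with the same TweenMax API the question uses (the state object is just a tween target):

// Illustrative only: tween a plain object and forward its value to lerp().
const state = { t: 0 }
TweenMax.to(state, 2, {
  t: 1,
  ease: Power3.easeInOut,
  onUpdate: () => myTransitionMaterial.lerp(state.t)
})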
vs:
varying vec2 vUv;
void main(){
    vUv = uv;
    gl_Position = vec4(position.xy, 0., 1.);
}
fs:
uniform float uLerp;
uniform sampler2D uTexA;
uniform sampler2D uTexB;
varying vec2 vUv;
void main(){
    gl_FragColor = vec4( mix( texture2D(uTexA, vUv).xyz, texture2D(uTexB, vUv).xyz, uLerp ), 1. );
}
An important concept to point out here: if you do something like this and lerp for the first time, your frame rate will get choppy as each texture is used for the first time. This happens because the renderer automatically uploads a texture to the GPU the first time it encounters it. For example, if you render a frame with each texture once before even doing this transition, it's going to be smooth as butter.
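A sketch of that warm-up, assuming the plane using this material is already in the scene and that renderer, scene and camera are the usual three.js objects:

// Illustrative warm-up: render one frame per texture so each gets uploaded
// to the GPU before the first real transition.
myTransitionMaterial.textures.forEach((tex) => {
  myTransitionMaterial.uniforms.uTexA.value = tex
  myTransitionMaterial.uniforms.uTexB.value = tex
  renderer.render(scene, camera)
})
myTransitionMaterial.lerp(0) // reset to the start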
If the number of textures is already fixed (and it should be, since they are passed in as uniforms anyway), I would do it a little differently:
I would define a float uniform that acts as your mixer and then use a 0-1 value to transition between the textures. This way you can animate the mixer variable however you like and the GLSL stays pretty simple:
uniform sampler2D t1;
uniform sampler2D t2;
uniform sampler2D t3;
uniform float mixer;
varying vec2 vUv;

void main(){
    vec4 c1 = texture2D(t1, vUv);
    vec4 c4 = c1; //create a duplicate so you can loop back
    vec4 c2 = texture2D(t2, vUv);
    vec4 c3 = texture2D(t3, vUv);
    float mp1 = .33333; //define the locations of t2
    float mp2 = .66666; //and t3
    float w = .33333;   //define the width
    c1 *= 1. - mix(0.0, w, abs(mixer));       //this is at 1 when mixer is 0 & 0 when .333
    c2 *= 1. - mix(0.0, w, abs(mixer - mp1)); //this is 1 when .333 & 0 when 0<mixer>.666
    c3 *= 1. - mix(0.0, w, abs(mixer - mp2)); //this is 1 when .666 & 0 when .333<mixer>1.0
    c4 *= 1. - mix(0.0, w, abs(mixer - 1.0)); //this is 1 when 1 & 0 when .666<mixer
    gl_FragColor = c1 + c2 + c3 + c4; //now it will only ever be a mixture of 2 textures
}
So then you do some border function on mixer so that
if(mixer > 1)mixer --;
if(mixer < 0)mixer ++;
and then you can go from T1 to T2 by tweening from 0 to 0.3333. You can go from T2 to T3 by tweening from .333 to .666, and from T3 to T1 by tweening from .666 to 1.0, and so on.
Then you just need to do a little management so that your tweens go circularly, i.e. if the distance from the current position to the target position is greater than about 1/3, you do a jump from 0 to 1 or from 1 to 0.
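A hypothetical sketch of that management in JS (it uses a shorter-way-around criterion and assumes the mixer uniform lives in a ShaderMaterial uniforms object named uniforms, with TweenMax available as in the question):

// Illustrative helper: always tween the short way around the 0..1 "ring".
const mixerState = { value: 0 }
function tweenMixerTo(target, duration) {
  // if the direct path is longer than half the ring, start from a wrapped value
  if (Math.abs(target - mixerState.value) > 0.5) {
    mixerState.value += (target > mixerState.value) ? 1.0 : -1.0
  }
  TweenMax.to(mixerState, duration, {
    value: target,
    onUpdate: () => {
      // keep the shader-side mixer inside [0, 1)
      uniforms.mixer.value = ((mixerState.value % 1) + 1) % 1
    }
  })
}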
I've been searching for hours now - without much luck.
I have a widget with the window flag Qt::Popup set, and I'm trying to create smooth rounded corners.
I've tried using a stylesheet, but then the transparent parts of the corners become black. The same happens if I override the widget's paint event and draw a rectangle with rounded corners in the widget.
I've also tried to set a mask, but the result gets very pixelated.
After some reading I found out that the black corners appear because the widget is a top-level widget. But I would think it's still possible somehow?
Does anyone know what I can do to either get rid of the black corners or to smoothen the mask? Any ideas are appreciated!
The paint event:
void PopUp::paintEvent(QPaintEvent *event)
{
    QPainter painter(this);
    QColor greyColor(0xFFC5C6C6);
    QRect rect(0, 0, width(), height());

    painter.setRenderHint(QPainter::Antialiasing);
    painter.setBrush(QBrush(greyColor));
    painter.setPen(QPen(greyColor));
    painter.drawRoundedRect(rect, 10, 10);
}
The function that sets the mask:
void PopUp::setRoundedCorners(int radius)
{
    QRegion verticalRegion(0, radius, width(), height() - 2 * radius);
    QRegion horizontalRegion(radius, 0, width() - 2 * radius, height());
    QRegion circle(0, 0, 2 * radius, 2 * radius, QRegion::Ellipse);

    QRegion region = verticalRegion.united(horizontalRegion);
    region = region.united(circle);
    region = region.united(circle.translated(width() - 2 * radius, 0));
    region = region.united(circle.translated(width() - 2 * radius, height() - 2 * radius));
    region = region.united(circle.translated(0, height() - 2 * radius));

    setMask(region);
}
Create a widget with the Qt::Window | Qt::FramelessWindowHint window flags and the Qt::WA_TranslucentBackground attribute.
Create a QFrame inside the widget.
Set a stylesheet on the QFrame, for example:
border: 1px solid red;
border-radius: 20px;
background-color: black;
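A minimal sketch of those steps (widget and layout names are illustrative):

#include <QApplication>
#include <QFrame>
#include <QVBoxLayout>
#include <QWidget>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    // frameless, translucent top-level window
    QWidget popup(nullptr, Qt::Window | Qt::FramelessWindowHint);
    popup.setAttribute(Qt::WA_TranslucentBackground);

    // the frame provides the visible rounded rectangle via the stylesheet
    QFrame *frame = new QFrame(&popup);
    frame->setStyleSheet("border: 1px solid red;"
                         "border-radius: 20px;"
                         "background-color: black;");

    QVBoxLayout *layout = new QVBoxLayout(&popup);
    layout->setContentsMargins(0, 0, 0, 0);
    layout->addWidget(frame);

    popup.resize(200, 150);
    popup.show();
    return app.exec();
}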
I need to remove all odd lines from a texture - this is part of a simple deinterlacer.
In the following code sample, instead of sampling RGB from the texture, I chose to output white for odd lines and red for even lines, so I can visually check whether the result is what I expected.
_texcoord is passed in from vertex shader and has a range of [0, 1] for both x and y
uniform sampler2D sampler0; /* not used here because we directly output the white or red color */
varying highp vec2 _texcoord;

void main() {
    highp float height = 480.0; /* assume the texture has a height of 480 */
    highp float y = height * _texcoord.y;
    if (mod(y, 2.0) >= 1.0) {
        gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0); /* white for odd lines */
    } else {
        gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); /* red for even lines */
    }
}
When rendered to the screen, the output isn't what I expected. Vertically it's RRWWWRRWWW, but I am really expecting RWRWRWRW (i.e. alternating between red and white).
My code runs on iOS and targets GLES 2.0, so it should be no different on Android with GLES 2.0.
Question: where did I go wrong?
EDIT
Yes, the texture height is correct.
I guess my question is: given a _texcoord.y, how do I tell whether it refers to an odd or even line of the texture?
uniform sampler2D tex;
varying highp vec2 uv;

void main(void)
{
    vec2 p = vec2(floor(gl_FragCoord.x), floor(gl_FragCoord.y));
    if (mod(p.y, 2.0) == 0.0)
        gl_FragColor = vec4(texture2D(tex, uv).xyz, 1.0);
    else
        gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0);
}
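If you would rather derive the line index from _texcoord (as asked in the EDIT), a sketch along the same lines, assuming the texture height (480.0 here) is known or passed in as a uniform, is:

/* Illustrative variant: recover the integer line index from _texcoord.y. */
uniform sampler2D sampler0;
varying highp vec2 _texcoord;

void main() {
    highp float line = floor(_texcoord.y * 480.0); /* line index 0..479 */
    if (mod(line, 2.0) < 1.0) {
        gl_FragColor = texture2D(sampler0, _texcoord); /* keep even lines */
    } else {
        gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0);       /* blank odd lines */
    }
}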
I'm using OpenGL ES 2.0 to create the following scene:
Draw a background image on the entire screen and above it draw another overlay image that fades in and out (alpha changes)
I also need to use "Screen blend" to blend the overlay and the background textures.
So I created a shader that blends the two textures. I thought I could use a uniform (randomAlpha) to change the overlay texture's alpha over time and create a fade animation, but the following code doesn't do the trick: the overlay texture's opacity doesn't change!
I know there is an "Alpha Blend" that I can use to blend the overlay and background textures, but the problem is that I want the final overlay (after the opacity changes) to blend with the background using a "Screen Blend", not an "Alpha Blend".
This is what my fragment shader's main method looks like:
void main()
{
    mediump vec4 background = texture2D(backgroundTexture, backgroundCoords);
    mediump vec4 overlay = texture2D(overlayTexture, overlayCoords);

    overlay.a = randomAlpha;

    // add overlay to background (using screen blend)
    mediump vec4 whiteColor = vec4(1.0);
    gl_FragColor = whiteColor - ((whiteColor - overlay) * (whiteColor - background));
}
Clearly I'm missing something important here.
How can I change the texture's opacity? How can I create the fade effect?
Like you already figured, it's just a matter of blending.
What you're missing is that you need to re-calculate the RGB channels when you change the alpha value, if you want to blend the two textures together.
void main()
{
    mediump vec4 background = texture2D(backgroundTexture, backgroundCoords);
    mediump vec4 overlay = texture2D(overlayTexture, overlayCoords);

    overlay.a = randomAlpha;
    background.a = 1.0 - overlay.a;

    gl_FragColor = vec4(background.rgb * background.a + overlay.rgb * overlay.a, 1.0);
}
Here is a simplified version of the code above; the version above is just easier to understand and read.
void main()
{
    mediump vec4 background = texture2D(backgroundTexture, backgroundCoords);
    mediump vec4 overlay = texture2D(overlayTexture, overlayCoords);

    overlay.rgb *= randomAlpha;
    background.rgb *= 1.0 - randomAlpha;

    gl_FragColor = vec4(background.rgb + overlay.rgb, 1.0);
}
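Note that the two snippets above are a linear (alpha) blend. If you specifically want to keep the screen blend from the question and only fade its strength, one possible sketch is to screen-blend first and then mix the result with the plain background using randomAlpha:

// Illustrative variant: fade a screen-blended overlay in and out.
// randomAlpha = 0 shows the plain background, 1 shows the full screen blend.
void main()
{
    mediump vec4 background = texture2D(backgroundTexture, backgroundCoords);
    mediump vec4 overlay = texture2D(overlayTexture, overlayCoords);

    mediump vec4 whiteColor = vec4(1.0);
    mediump vec4 screened = whiteColor - ((whiteColor - overlay) * (whiteColor - background));

    gl_FragColor = vec4(mix(background.rgb, screened.rgb, randomAlpha), 1.0);
}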
What do I have to change in this vertex shader / fragment shader to go from simple grayscale depth to RGBA-encoded depth, in particular to display the ChromaDepth(tm) color scheme instead of grayscale?
http://www.chromatek.com/pix/101color.jpg
<script id="vert" type="webgl/fragment-shader">
uniform float near;
uniform float far;
varying vec3 color;
void main() {
gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
float depth = 1.0 - ((gl_Position.z - near) / (far - near));
color = vec3(depth);
}
</script>
<script id="frag" type="webgl/fragment-shader">
varying vec3 color;
void main() {
gl_FragColor = vec4(color, 1.0);
}
</script>
You are outputting the color in the line color = vec3(depth). This variant of the vec3 function creates a vector where all 3 components are equal to the input value. You can just as well use it to create a 3-component vector: color = vec3(red, green, blue). If you also want to pass an alpha value to the fragment shader, you have to change this to vec4(red, green, blue, alpha), and also change the declaration varying vec3 color to varying vec4 color in both the vertex and the fragment shader.
The algorithm used by ChromaDepth to calculate the correct red, green and blue value from the depth is published here. This is the relevant section:
//Definition of 3d_red component of the color. The value should be between 1
//and 0 over the Range of 0 to 0.75. It should be 0 for all Ranges greater
//than 0.75. From 0 to 0.75 it is calculated by Red_func.
define Red_Range Range/0.9
define Red_func
(-2.13*Red_Range^4-1.07*Red_Range^3+0.133*Red_Range^2+0.0667*Red_Range+1)
define Cc (Red_func <0 || Red_Range>0.75 ? 0:1)
define Dd (Red_func >1 ? 1:0)
define 3d_red (Red_Range<0.75 ? Red_func:Cc*Dd)
//Definition of 3d_green component of the color. The value should be between
//0 and 1 over the Range of 0 to 1, starting from 0, rising to 1, then falling
//to 0 again. It should be 0 at both extremes of Range.
define Green_func1 (1.6*Range^2+1.2*Range)
define Green_func2 (3.2*Range^2-6.8*Range+3.6)
define 3d_green (Range<=0.5 ? Green_func1:Green_func2)
//Definition of 3d_blue component of the color. The value should rise from
//0 at a Range of 0.5 up to 1 at a Range of 1. Below Range 0.5 the value
//must be 0.
define Blue_func (-4.8*Range^2+9.2*Range-3.4)
define 3d_blue (Range>0.5 ? Blue_func:0)
The input value of these functions is "Range" which is what you call "depth" in your code (0.0 is closest and 1.0 is furthest away).
Thank you Philipp. Here's my implementation:
<script id="vert" type="webgl/fragment-shader">
uniform float near;
uniform float far;
varying vec3 color;
float func_r(float r_range)
{
float r_depth = (-2.13*r_range*r_range*r_range*r_range-1.07*r_range*r_range*r_range+0.133*r_range*r_range+0.0667*r_range+1.0);
return r_depth;
}
float func_g1 (float g_range)
{
float g_depth = (1.6*g_range*g_range+1.2*g_range);
return g_depth;
}
float func_g2 (float g_range2)
{
float g_depth2 = (3.2*g_range2*g_range2-6.8*g_range2+3.6);
return g_depth2;
}
float func_b (float b_range)
{
float b_depth = (-4.8*b_range*b_range+9.2*b_range-3.4);
return b_depth;
}
void main() {
gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
float depth = 0.0 + ((gl_Position.z - near) / (far - near));
/* ### Definition of 3d_red component of the color.*/
float Red_Range = depth / 0.9;
float Cc = (func_r(Red_Range) < 0.0 || Red_Range > 0.75 ? 0.0:1.0);
float Dd = (func_r(Red_Range) > 1.0 ? 1.0:0.0);
float calc_r = (Red_Range < 0.75 ? func_r(Red_Range) : Cc*Dd);
/* ### Definition of 3d_green component of the color.*/
float calc_g = (depth <= 0.5 ? func_g1(depth) : func_g2(depth));
/* ### Definition of 3d_blue component of the color.*/
float calc_b = (depth > 0.5 ? func_b(depth) : 0.0);
color = vec3(calc_r,calc_g,calc_b);
}
</script>
<script id="frag" type="webgl/fragment-shader">
varying vec3 color;
void main() {
gl_FragColor = vec4(color, 1.0);
}
</script>