Trouble with OpenLayers Styles

So, tired of always seeing the bright orange default regular polygons, I'm trying to learn to style OpenLayers.
I've had some success with:
var layer_style = OpenLayers.Util.extend({},OpenLayers.Feature.Vector.style['default']);
layer_style.fillColor = "#000000";
layer_style.strokeColor = "#000000";
polygonLayer = new OpenLayers.Layer.Vector("PolygonLayer");
polygonLayer.style = layer_style;
But since I am drawing my polygons with DrawFeature, my style only takes effect once I've finished drawing, and seeing it snap from bright orange to grey is sort of disconcerting. So, I learned about temporary styles, and tried:
var layer_style = new OpenLayers.Style({"default": {fillColor: "#000000"}, "temporary": {fillColor: "#000000"}})
polygonLayer = new OpenLayers.Layer.Vector("PolygonLayer");
polygonLayer.style = layer_style;
This got me a still orange square--until I stopped drawing, when it snapped into completely opaque black. I figured maybe I had to explicitly set the fillOpacity...no dice. Even when I changed both fill colors to be pink and blue, respectively, I still saw only orange and opaque black.
I've tried messing with StyleMaps, since I read that if you only add one style to a style map, it uses the default one for everything, including the temporary style.
var layer_style = OpenLayers.Util.extend({}, OpenLayers.Feature.Vector.style['default']);
var style_map = new OpenLayers.StyleMap(layer_style);
polygonLayer = new OpenLayers.Layer.Vector("PolygonLayer");
polygonLayer.style = style_map;
That got me the black opaque square, too. (Even though that layer style works when not given to a map). Passing the map to the layer itself like so:
polygonLayer = new OpenLayers.Layer.Vector("PolygonLayer", style_map);
Didn't get me anything at all. Orange all the way, even after the polygon was drawn.
polygonLayer = new OpenLayers.Layer.Vector("PolygonLayer", {styleMap: style_map});
Is a lot more successful: Orange while drawing, translucent black with black outline when drawn. Just like when I didn't use a map. Problem is, still no temporary...
So, I tried initializing my map this way:
var style_map = new OpenLayers.StyleMap({"default": layer_style, "temporary": layer_style});
No opaque square, but no dice for the temporary, either... Still orange snapping to black transparent. Even if I make a new Style (layer_style2), and set temporary to that, still no luck. And no luck with setting "select" style, either.
What am I doing wrong? Temporary IS for styling things that are currently being sketched, correct? Is there some other way specific to the DrawFeature control?
Edit: setting extendDefault to be true doesn't seem to help, either...
var style_map = new OpenLayers.StyleMap({"default": layer_style, "temporary": layer_style}, {"extendDefault": "true"});

I've found two solutions for this problem. In both solutions, you have to change some parameters of DrawFeature to get the behaviour you want.
1. Change the handler style of the DrawFeature control. The drawFeature function in OpenLayers.Handler.Polygon uses the handler's style parameter for the sketch feature, so you have to change that style.
When creating the control, use:
var drawPolygon = new OpenLayers.Control.DrawFeature(polygonLayer, OpenLayers.Handler.Polygon, {handlerOptions:{style:myStyle}});
Later, you can change it by:
drawPolygon.handler.style = myStyle;
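Putting solution 1 together, a minimal sketch might look like the following (myStyle is a hypothetical symbolizer object you define yourself, and map is assumed to be your existing OpenLayers.Map):
// Hypothetical symbolizer, based on the default vector style
var myStyle = OpenLayers.Util.extend({}, OpenLayers.Feature.Vector.style['default']);
myStyle.fillColor = "#000000";
myStyle.strokeColor = "#000000";
myStyle.fillOpacity = 0.3;

var drawPolygon = new OpenLayers.Control.DrawFeature(polygonLayer, OpenLayers.Handler.Polygon, {
    handlerOptions: {style: myStyle}   // style used while the polygon is being sketched
});
map.addControl(drawPolygon);
drawPolygon.activate();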
2. Change the create callback of the DrawFeature control. Set the style of the newly created temporary feature inside the create callback:
var drawPolygon = new OpenLayers.Control.DrawFeature(polygonLayer, OpenLayers.Handler.Polygon, {
    callbacks: {
        create: function(vertex, feature) {
            feature.style = myStyle;
            this.layer.events.triggerEvent("sketchstarted", {vertex: vertex, feature: feature});
        }
    }
});
Similarly, you can change the callback later.
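For example, something along these lines might work (an untested sketch; it assumes drawPolygon was created as above, that the handler keeps a reference to the same callbacks object, and someOtherStyle is a hypothetical symbolizer):
// Assumption: drawPolygon.handler.callbacks is the merged callbacks object,
// so reassigning an entry swaps the callback on the live handler.
drawPolygon.handler.callbacks.create = function(vertex, feature) {
    feature.style = someOtherStyle; // hypothetical symbolizer object
    this.layer.events.triggerEvent("sketchstarted", {vertex: vertex, feature: feature});
};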

If you want all vectors to have a constant style, but not the boring orange, then try this:
vecLayer = new OpenLayers.Layer.Vector(
    "Route Layer", // layer name
    {
        styleMap: new OpenLayers.StyleMap({
            pointRadius: "6",
            fillColor: "#666666"
        }),
        renderers: renderer
    }
);
You have loads of properties you can mess about with; have a look at these pages:
dev.openlayers (check the Constants section)
docs.openlayers (more useful info)
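As a rough sketch of combining a few of those documented symbolizer properties (the values below are arbitrary examples, and map is assumed to be your existing OpenLayers.Map):
// Arbitrary example values for common symbolizer properties
var vecLayer = new OpenLayers.Layer.Vector("Route Layer", {
    styleMap: new OpenLayers.StyleMap({
        pointRadius: 6,
        fillColor: "#666666",
        fillOpacity: 0.5,
        strokeColor: "#333333",
        strokeWidth: 2,
        strokeOpacity: 0.8
    })
});
map.addLayer(vecLayer);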

Related

Three.js PMREMGenerator has incorrect texture filtering

I have a scene in Three (using AFrame) that requires environment mapping for lighting. Using a standard HDR cubemap, I get the following results:
This is correct as far as blurring based on roughness goes, since I have mipmaps being generated and the minFilter set to LinearMipmapLinearFilter. The issue with this approach is that ambient lighting isn't being applied - the light in the scene, a directional light, is the only thing providing any lighting information to the scene. Unfortunately, this results in entirely black shadows, no matter how bright the HDRI.
However, if I use the PMREMGenerator from Three in addition to the above, the ambient lighting issue is solved. Unfortunately, this is what happens as a result:
As shown here, the texture filtering is now out of whack. According to the comments left in the PMREMGenerator script itself:
This class generates a Prefiltered, Mipmapped Radiance Environment Map (PMREM) from a cubeMap environment texture. This allows different levels of blur to be quickly accessed based on material roughness. It is packed into a special CubeUV format that allows us to perform custom interpolation so that we can support nonlinear formats such as RGBE. Unlike a traditional mipmap chain, it only goes down even more filtered 'mips' at the same LOD_MIN resolution, associated with higher roughness levels. In this way we maintain resolution to smoothly interpolate diffuse lighting while limiting sampling computation.
...which leads me to believe the output should be smoothed, like in my first example.
Here's my code for the first example:
const ctl = new HDRCubeTextureLoader();
ctl.setPath(hdrPath);
ctl.setDataType(THREE.UnsignedByteType);
const hdrUrl = [
`${src}/px.hdr`,
`${src}/nx.hdr`,
`${src}/py.hdr`,
`${src}/ny.hdr`,
`${src}/pz.hdr`,
`${src}/nz.hdr`
];
const hdrSky = ctl.load(hdrUrl, tex => this.skies[src] = hdrSky );
// Then later...
obj.material.envMap = this.skies[this.skySources[index]];
obj.material.needsUpdate = true;
Here's my code for the second example:
const ctl = new HDRCubeTextureLoader();
ctl.setPath(hdrPath);
ctl.setDataType(THREE.UnsignedByteType);
const hdrUrl = [
`${src}/px.hdr`,
`${src}/nx.hdr`,
`${src}/py.hdr`,
`${src}/ny.hdr`,
`${src}/pz.hdr`,
`${src}/nz.hdr`
];
const pmremGen = new PMREMGenerator(this.el.sceneEl.renderer);
pmremGen.compileCubemapShader();
const hdrSky = ctl.load(hdrUrl, tex => {
const hdrRenderTarget = pmremGen.fromCubemap(hdrSky);
this.skies[src] = hdrRenderTarget.texture;
});
// Then later...
obj.material.envMap = this.skies[this.skySources[index]];
obj.material.needsUpdate = true;
I seem to have hit a wall in regard to the filtering. Even when I explicitly change the filtering type and turn on mipmap generation inside of PMREMGenerator.js, the results appear to be the same. Here's an official example that uses the PMREMGenerator without any issue: https://threejs.org/examples/webgl_materials_envmaps_hdr.html
As a closing remark I will note that we're using Three.js r111 (and there are reasons we can't switch it out fully), so I brought in the current version of the PMREMGenerator from the latest version of Three as of this writing (r122 - later version needed as the r111 one is written completely differently). Thus, I wouldn't be surprised if this was all being caused by some conflict between versions.
EDIT: Just put the resulting envmap as a standard map on some planes, and much to my surprise none of the blurred LODs even show up. Here's what mine look like:
And here's what it should resemble (don't mind the torus knot):
EDIT: I've found a workaround for now (essentially, not using PMREMGenerator), but will leave this up in case a solution is discovered.
I believe the problem is with the argument you're using in fromCubemap(). Right now your code isn't using the tex cube texture variable that's passed to the load callback. Try this:
const hdrSky = ctl.load(hdrUrl, tex => {
// use tex instead of hdrSky
const hdrRenderTarget = pmremGen.fromCubemap(tex);
this.skies[src] = hdrRenderTarget.texture;
});

How do I change the color of already drawn InkStrokes in windows universal

I have drawn some ink strokes on an InkCanvas and am now wanting to change the pen colour. I can change the colour of any additional strokes I draw using CopyDefaultDrawingAttributes and UpdateDefaultDrawingAttributes, and that works fine. But how do I alter the colour of the strokes that are already present in the StrokeContainer? I've tried:
foreach (InkStroke stroke in inkCanvas.InkPresenter.StrokeContainer.GetStrokes())
{
    stroke.DrawingAttributes.Color = strokeColour;
}
This code executes with no exceptions, but stroke.DrawingAttributes.Color still shows the previous colour.
Any ideas?
Thanks...
Robert
You cannot change a stroke's colour by setting a sub-property of its DrawingAttributes directly. You must create an InkDrawingAttributes object (or copy the stroke's existing one), set the desired values on that object, and then assign it back to the DrawingAttributes property of the stroke.
So you can code for example like this:
foreach (InkStroke stroke in inkCanvas.InkPresenter.StrokeContainer.GetStrokes())
{
    // stroke.DrawingAttributes.Color = Windows.UI.Colors.Yellow; // has no effect (see explanation above)
    InkDrawingAttributes drawingAttributes = new InkDrawingAttributes();
    drawingAttributes.Color = Windows.UI.Colors.Yellow;
    stroke.DrawingAttributes = drawingAttributes; // assign the whole object back
}
For more information, you can refer to the InkStroke.DrawingAttributes property documentation.

How to fix tiny holes in the Mesh after subtract operation with Three.CSG

I have a Box Mesh where I subtract another Box with Three.CSG to create a wall with a window.
After doing so, there are tiny holes in the Mesh alongside the cut. They are not visible all the time, but show up when moving around.
How to close these holes?
This is part of the code how I am creating the Mesh:
var wallBsp = new ThreeBSP( myWallMesh );
var subMesh = new THREE.Mesh( mygeo );
var subBsp = new ThreeBSP( subMesh );
var subtractBsp = wallBsp.subtract( subBsp );
var result = subtractBsp.toMesh();
result.material.shading = THREE.FlatShading;
result.geometry.computeVertexNormals();
Update
I have created a jsfiddle, but it is difficult to reproduce the error; I couldn't make it visible there: http://jsfiddle.net/L0rdzbej/23/
However, you can see the full application here.
As @gaitat suggested, geometry.mergeVertices() does not look like it changes anything for me. Chandler Prall hinted at the source, where precisionPoints, a variable inside the mergeVertices function, could solve this. Depending on the scale of the scene its value should be lower or even negative, but I have had no success so far.
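For reference, here is an untested sketch of what adjusting that precision might look like: a simplified re-implementation of the old THREE.Geometry.mergeVertices() with precisionPoints exposed as a parameter. It omits the duplicate/degenerate face cleanup the original also performs, so treat it as an illustration only.
// Simplified sketch of mergeVertices with a configurable precision.
// A lower (or negative) precisionPoints value snaps vertices to a coarser grid,
// which may close hairline gaps left by the CSG cut.
function mergeVerticesWithPrecision(geometry, precisionPoints) {
    var verticesMap = {};   // position key -> index of first vertex seen there
    var unique = [];
    var changes = [];       // old vertex index -> new vertex index
    var precision = Math.pow(10, precisionPoints);

    for (var i = 0; i < geometry.vertices.length; i++) {
        var v = geometry.vertices[i];
        var key = Math.round(v.x * precision) + '_' +
                  Math.round(v.y * precision) + '_' +
                  Math.round(v.z * precision);
        if (verticesMap[key] === undefined) {
            verticesMap[key] = i;
            unique.push(v);
            changes[i] = unique.length - 1;
        } else {
            changes[i] = changes[verticesMap[key]];
        }
    }

    // Remap face indices to the de-duplicated vertex list.
    for (var f = 0; f < geometry.faces.length; f++) {
        var face = geometry.faces[f];
        face.a = changes[face.a];
        face.b = changes[face.b];
        face.c = changes[face.c];
    }

    geometry.vertices = unique;
    geometry.verticesNeedUpdate = true;
    geometry.elementsNeedUpdate = true;
}

// Usage on the CSG result before recomputing normals, e.g.:
// mergeVerticesWithPrecision(result.geometry, 2);
// result.geometry.computeVertexNormals();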

How to place my logo on my map without white fields?

I have a problem with my PictureBox.
I want to place it on the map I have in my program.
That works when I put my image in a PictureBox and then call BringToFront();
I wanted to add a picture of the problem, but I just started at Stack Overflow and don't have enough reputation yet... :(
Anyway, my image gets displayed, but with white fields around it. The image I use doesn't have this white area around it.
How can I make my PictureBox transparent so that the white fields are removed?
LogoBox.Location = new Point(size.Width - 340, size.Height - 100);
LogoBox.Image = Properties.Resources.Troepoet;
LogoBox.Size = new System.Drawing.Size(250, 40);
LogoBox.SizeMode = PictureBoxSizeMode.StretchImage;
LogoBox.BackColor = Color.Transparent;
I tried doing it by just drawing a bitmap as well, but then I can't see any way to place it on the map. The map is a 'dominant' control.
Any help/suggestions?
Thx.
You don't need to put a PictureBox on the map to draw your logo.
The easy way is to override the map's OnPaintOverlays() and draw whatever you want (logo, text, any shape) on the map. Alternatively, if you don't want to inherit from GMapControl, drag & drop a GMapControl onto a form and handle its Paint() event:
Private Sub GMapControl1_Paint(sender As Object, e As PaintEventArgs) Handles GMapControl1.Paint
    Dim logo As Image = Image.FromFile("C:\transparentLogo.png")
    e.Graphics.DrawImage(logo, GMapControl1.Width - logo.Width - 5, 5, logo.Width, logo.Height)
End Sub
Make sure your image has a transparent background and is saved in PNG format.

PySDL2: Renderer or Window Surface for handling colors and text?

My question concerns the sdl2.ext.Renderer mainly, and has to do with a problem I've encountered when trying to render sprites on a sdl2.ext.Window surface.
So right now, for coloring my background on an SDL2 Window, I make the following call:
White = sdl2.ext.Color(255, 255, 255)

class Background(sdl2.ext.SoftwareSpriteRenderSystem):
    def __init__(self, window):
        super(Background, self).__init__(window)
        sdl2.ext.fill(self.surface, White)
This colors the surface of the Window with a white background. However, I also want to display text on the screen. This is done by creating a TextureSprite using the from_text method of the sdl2.ext.SpriteFactory class as follows:
Renderer = sdl2.ext.Renderer(W) # Creating Renderer
ManagerFont = sdl2.ext.FontManager(font_path = "OpenSans.ttf", size = 14) # Creating Font Manager
Factory = sdl2.ext.SpriteFactory(renderer=Renderer) # Creating Sprite Factory
Text = Factory.from_text("Unisung Softworks",fontmanager=ManagerFont) # Creating TextureSprite from Text
Renderer.copy(Text, dstrect= (0,0,Text.size[0],Text.size[1])) # Resizing the Texture to fit the text dimensions when rendered
The problem occurs when the event loop is run.
running = True
while running:
    events = sdl2.ext.get_events()
    for event in events:
        if event.type == sdl2.SDL_QUIT:
            running = False
            break
        if event.type == sdl2.SDL_MOUSEBUTTONDOWN:
            pass
    Renderer.copy(Text, dstrect=(0, 0, Text.size[0], Text.size[1]))
    Renderer.present()  # Problem 1
    W.refresh()  # Problem 2
return 0
Full Example Paste
When both Renderer.present() and W.refresh() are called, I get a flickering effect from the screen. This seems to be because the Renderer overwrites the White colored Window surface, which then gets colored over again by the Window.refresh() call. This then repeats, resulting in a flickering mess.
I would like to know what I can do to solve this problem? Should I even be using both Window.refresh() and the Renderer at the same time? Is there a way to let one render the other? (Renderer renders background for example). If anyone can help me out, that would be great.
The problem is that you are using Window.refresh() and a surface-based SoftwareSpriteRenderSystem together with an SDL_Renderer and its SDL_Texture objects.
As outlined in the PySDL2 docs, this is likely to break (since software surface buffers get mixed with hardware-based textures) and you should avoid it. If you want to color the window background, you should use Renderer.clear(White) instead of your Background class and call it before copying the text via Renderer.copy():
Renderer.clear(White)
Renderer.copy(Text, dstrect= (0,0,Text.size[0],Text.size[1]))
Renderer.present()
Do not forget to change the color for your FontManager. By default it uses a white text color, so you should change it to e.g. your Red color:
ManagerFont = sdl2.ext.FontManager(font_path = "OpenSans.ttf", size = 14, color=Red)
