How to create events using React Native
I'm building an application with React VR. If you don't know React VR: it's based on React Native, adds some extra components, and bundles Three.js and other tooling specifically for WebVR.
I've made a component named NavigateButton. Below is my code:
import React from 'react';
import { AppRegistry, asset, StyleSheet, Pano, Text, View, VrButton, Sphere } from 'react-vr';

export class NavigateButton extends React.Component {
  render() {
    return (
      <VrButton onClick={() => this.onNavigating()}>
        <Sphere radius={0.5} widthSegments={10} heightSegments={10} style={{ color: 'red' }} />
      </VrButton>
    );
  }

  onNavigating() { // This method must raise an event
    console.log(this.props.to);
  }
}
If the user clicks the VrButton (which is like an HTML5 button tag, but for VR, with a sphere inside it), an event must be raised to the place where I use the NavigateButton component. That's in the code below:
import React from 'react';
import { AppRegistry, asset, StyleSheet, Pano, Text, View, VrButton, Sphere } from 'react-vr';
import { NavigateButton } from './components/nativateButton.js';

let room = asset('360 LR/inkom_hal.jpg');

export default class MainComp extends React.Component {
  render() {
    return (
      <View>
        <Pano source={asset('360 LR/inkom_hal.jpg')} />
        <View style={{ transform: [{ translate: [20, 0, 0] }] }}>
          <NavigateButton to="garage"></NavigateButton>
          {/* and must be caught here */}
        </View>
        <View style={{ transform: [{ translate: [-7, 0, -20] }] }}>
          <NavigateButton to="woonkamer"></NavigateButton>
          {/* or here */}
        </View>
      </View>
    );
  }
}

AppRegistry.registerComponent('MainComp', () => MainComp);
Is it possible to do that? I would like to use something like the code below to catch the event:
<NavigateButton to="woonkamer" onNavigate={() => this.change()}></NavigateButton>
I've searched the internet but found nothing that could help me.
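I imagine NavigateButton itself could call a callback passed down via props; a rough sketch of what I have in mind (the onNavigate prop name is just my assumption of how it could work, this is plain React props rather than anything React VR-specific):

// Sketch: let NavigateButton call a callback passed in via props
export class NavigateButton extends React.Component {
  onNavigating() {
    console.log(this.props.to);
    if (this.props.onNavigate) {
      this.props.onNavigate(this.props.to); // notify whoever rendered the button
    }
  }

  render() {
    return (
      <VrButton onClick={() => this.onNavigating()}>
        <Sphere radius={0.5} widthSegments={10} heightSegments={10} style={{ color: 'red' }} />
      </VrButton>
    );
  }
}

// Usage in MainComp:
// <NavigateButton to="woonkamer" onNavigate={to => this.change(to)} />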
Here are the instructions for creating a sample VR app with React VR, prepared by me and my team:
Creating a VR tour for web
The structure of the future app's directory is as follows:
+-node_modules
+-static_assets
+-vr
\-.gitignore
\-.watchmanconfig
\-index.vr.js
\-package.json
\-postinstall.js
\-rn-cli-config.js
The code of the web app lives in the index.vr.js file, while the static_assets directory hosts external resources (images, 3D models). You can learn more about how to get started with a React VR project here. The index.vr.js file contains the following:
import React from 'react';
import {
  AppRegistry,
  asset,
  StyleSheet,
  Pano,
  Text,
  View,
} from 'react-vr';

class TMExample extends React.Component {
  render() {
    return (
      <View>
        <Pano source={asset('chess-world.jpg')}/>
        <Text
          style={{
            backgroundColor: 'blue',
            padding: 0.02,
            textAlign: 'center',
            textAlignVertical: 'center',
            fontSize: 0.8,
            layoutOrigin: [0.5, 0.5],
            transform: [{translate: [0, 0, -3]}],
          }}>
          hello
        </Text>
      </View>
    );
  }
}

AppRegistry.registerComponent('TMExample', () => TMExample);
VR components in use
We use the React Native packager for code pre-processing, compilation, bundling, and asset loading. In the render function there are View, Pano, and Text components. Each of these React VR components comes with a style attribute to help control the layout.
To wrap it up, check that the root component gets registered with AppRegistry.registerComponent, which bundles the application and readies it to run. The next step to highlight in our React VR project is writing the two main files.
Index.vr.js file
In constructor we’ve indicated the data for VR tour app. These are scene images, buttons to switch between scenes with X-Y-Z coordinates, values for animations. All the images we contain in static_assets folder.
constructor (props) {
  super(props);
  this.state = {
    scenes: [
      {scene_image: 'initial.jpg', step: 1, navigations: [{step: 2, translate: [0.73, -0.15, 0.66], rotation: [0, 36, 0]}]},
      {scene_image: 'step1.jpg', step: 2, navigations: [{step: 3, translate: [-0.43, -0.01, 0.9], rotation: [0, 140, 0]}]},
      {scene_image: 'step2.jpg', step: 3, navigations: [{step: 4, translate: [-0.4, 0.05, -0.9], rotation: [0, 0, 0]}]},
      {scene_image: 'step3.jpg', step: 4, navigations: [{step: 5, translate: [-0.55, -0.03, -0.8], rotation: [0, 32, 0]}]},
      {scene_image: 'step4.jpg', step: 5, navigations: [{step: 1, translate: [0.2, -0.03, -1], rotation: [0, 20, 0]}]}
    ],
    current_scene: {},
    animationWidth: 0.05,
    animationRadius: 50
  };
}
Then we’ve changed the output of images linking them to state, previously indicated in constructor.
<View>
  <Pano source={asset(this.state.current_scene['scene_image'])}
        style={{
          transform: [{translate: [0, 0, 0]}]
        }}/>
</View>
Navigational buttons
In each scene we've placed transition buttons for navigating within the tour, taking their data from the state. We subscribe to the onInput event to handle switching between scenes, binding this to the handlers as well.
<View>
  <Pano source={asset(this.state.current_scene['scene_image'])} onInput={this.onPanoInput.bind(this)}
        onLoad={this.sceneOnLoad} onLoadEnd={this.sceneOnLoadEnd}
        style={{ transform: [{translate: [0, 0, 0]}] }}/>
  {/* `that` is `this`, captured before the return (e.g. const that = this;), because the map callback is a plain function */}
  {this.state.current_scene['navigations'].map(function(item, i){
    return <Mesh key={i}
                 style={{
                   layoutOrigin: [0.5, 0.5],
                   transform: [{translate: item['translate']},
                               {rotateX: item['rotation'][0]},
                               {rotateY: item['rotation'][1]},
                               {rotateZ: item['rotation'][2]}]
                 }}
                 onInput={e => that.onNavigationClick(item, e)}>
      <VrButton
        style={{ width: 0.15,
                 height: 0.15,
                 borderRadius: 50,
                 justifyContent: 'center',
                 alignItems: 'center',
                 borderStyle: 'solid',
                 borderColor: '#FFFFFF80',
                 borderWidth: 0.01
        }}>
        <VrButton
          style={{ width: that.state.animationWidth,
                   height: that.state.animationWidth,
                   borderRadius: that.state.animationRadius,
                   backgroundColor: '#FFFFFFD9'
          }}>
        </VrButton>
      </VrButton>
    </Mesh>
  })}
</View>
onNavigationClick(item, e){
  if (e.nativeEvent.inputEvent.eventType === "mousedown" && e.nativeEvent.inputEvent.button === 0) {
    var new_scene = this.state.scenes.find(i => i['step'] === item.step);
    this.setState({current_scene: new_scene});
    postMessage({ type: "sceneChanged" });
  }
}
sceneOnLoad(){
  postMessage({ type: "sceneLoadStart" });
}

sceneOnLoadEnd(){
  postMessage({ type: "sceneLoadEnd" });
}

// These bind calls live in the constructor (see the sketch below):
this.sceneOnLoad = this.sceneOnLoad.bind(this);
this.sceneOnLoadEnd = this.sceneOnLoadEnd.bind(this);
this.onNavigationClick = this.onNavigationClick.bind(this);
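For clarity, here is a sketch of how the constructor shown earlier could look with those bindings in place. The extra bind for onMainWindowMessage is our assumption, since that handler (shown later) also calls setState:

constructor(props) {
  super(props);
  this.state = { /* scenes, current_scene, animationWidth, animationRadius as shown earlier */ };

  // Bind handlers once so they keep the right `this` when used as callbacks
  this.sceneOnLoad = this.sceneOnLoad.bind(this);
  this.sceneOnLoadEnd = this.sceneOnLoadEnd.bind(this);
  this.onNavigationClick = this.onNavigationClick.bind(this);
  this.onMainWindowMessage = this.onMainWindowMessage.bind(this);
  this.animatePointer = this.animatePointer.bind(this);
}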
Button animation
Below, we’ll display the code for navigation button animations. We’ve built animations on button increase principle, applying conventional requestAnimationFrame.
// Bound in the constructor, as above:
this.animatePointer = this.animatePointer.bind(this);

animatePointer(){
  var delta = this.state.animationWidth + 0.002;
  var radius = this.state.animationRadius + 10;
  if (delta >= 0.13) {
    delta = 0.05;
    radius = 50;
  }
  this.setState({animationWidth: delta, animationRadius: radius});
  this.frameHandle = requestAnimationFrame(this.animatePointer);
}

componentDidMount(){
  this.animatePointer();
}

componentWillUnmount(){
  if (this.frameHandle) {
    cancelAnimationFrame(this.frameHandle);
    this.frameHandle = null;
  }
}
In the componentWillMount function we've set the current scene. We've also subscribed to the message event for data exchange with the main thread; this is necessary because the React VR component runs in a separate worker thread.
In the onMainWindowMessage function we only process one message, the one with the newCoordinates key; we'll explain why later. Similarly, we've subscribed to the onInput event to handle arrow turns.
componentWillMount(){
  window.addEventListener('message', this.onMainWindowMessage);
  this.setState({current_scene: this.state.scenes[0]});
}

onMainWindowMessage(e){
  switch (e.data.type) {
    case 'newCoordinates':
      var scene_navigation = this.state.current_scene.navigations[0];
      scene_navigation['translate'] = [e.data.coordinates.x, e.data.coordinates.y, e.data.coordinates.z];
      this.forceUpdate();
      break;
    default:
      return;
  }
}
<Pano source={asset(this.state.current_scene['scene_image'])} onInput={this.onPanoInput.bind(this)}
style={{ transform: [{translate: [0, 0, 0]}] }}/>
rotatePointer(nativeEvent){
  switch (nativeEvent.keyCode) {
    case 38: // ArrowUp -> rotate around Y
      this.state.current_scene.navigations[0]['rotation'][1] += 4;
      break;
    case 39: // ArrowRight -> rotate around X
      this.state.current_scene.navigations[0]['rotation'][0] += 4;
      break;
    case 40: // ArrowDown -> rotate around Z
      this.state.current_scene.navigations[0]['rotation'][2] += 4;
      break;
    default:
      return;
  }
  this.forceUpdate();
}
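The onPanoInput handler wired up on the Pano isn't shown in the snippets above; a minimal sketch of what it might look like (an assumption: it simply forwards keyboard input events to rotatePointer) is:

onPanoInput(e){
  // Hypothetical handler: forward keyboard events from the Pano to rotatePointer
  var inputEvent = e.nativeEvent.inputEvent;
  if (inputEvent.eventType === 'keydown') {
    this.rotatePointer(inputEvent);
  }
}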
Arrow turns are done with the ↑ → ↓ arrow keys, for the Y, X, and Z axes respectively.
See and download the whole index.vr.js file on Github HERE.
Client.js file
Moving further into our React VR example, we've added the code below to the init function. The goal is to handle the ondblclick, onmousewheel, and message events, the latter being used for message exchange with the rendering thread. We've also kept references to the vr and vr.player._camera objects.
window.playerCamera = vr.player._camera;
window.vr = vr;
window.ondblclick= onRendererDoubleClick;
window.onmousewheel = onRendererMouseWheel;
vr.rootView.context.worker.addEventListener('message', onVRMessage);
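For context, in a client.js generated by the React VR CLI these lines sit inside the init function; a sketch of the surrounding file could look like this (the component name and the VRInstance options are assumptions based on the snippets above, not the exact code of this project):

// client.js (sketch)
import { VRInstance } from 'react-vr-web';

function init(bundle, parent, options) {
  const vr = new VRInstance(bundle, 'TMExample', parent, { ...options });

  // keep references used by the handlers below
  window.playerCamera = vr.player._camera;
  window.vr = vr;
  window.ondblclick = onRendererDoubleClick;
  window.onmousewheel = onRendererMouseWheel;
  vr.rootView.context.worker.addEventListener('message', onVRMessage);

  vr.render = function () {};
  vr.start(); // begin the render loop
  return vr;
}

window.ReactVR = { init };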
We’ve introduced the onVRMessage function for zoom returning to default when scenes change. Also, we have added the loader when scene change occurs.
function onVRMessage(e) {
  switch (e.data.type) {
    case 'sceneChanged':
      if (window.playerCamera.zoom != 1) {
        window.playerCamera.zoom = 1;
        window.playerCamera.updateProjectionMatrix();
      }
      break;
    case 'sceneLoadStart':
      document.getElementById('loader').style.display = 'block';
      break;
    case 'sceneLoadEnd':
      document.getElementById('loader').style.display = 'none';
      break;
    default:
      return;
  }
}
The onRendererDoubleClick function calculates the 3D coordinates and sends a message to the VR component to change the arrow coordinates. It relies on the get3DPoint helper, which is custom to our web VR application and is shown further below.
function onRendererDoubleClick(){
  // relies on the global window.event of the dblclick
  var x = 2 * (event.x / window.innerWidth) - 1;
  var y = 1 - 2 * (event.y / window.innerHeight);
  var coordinates = get3DPoint(window.playerCamera, x, y);
  vr.rootView.context.worker.postMessage({ type: "newCoordinates", coordinates: coordinates });
}
Switch to mouse wheel
We’ve used the onRendererMouseWheel function for switching zoom to a mouse wheel.
function onRendererMouseWheel(){
  if (event.deltaY > 0) {
    if (window.playerCamera.zoom > 1) {
      window.playerCamera.zoom -= 0.1;
      window.playerCamera.updateProjectionMatrix();
    }
  } else {
    if (window.playerCamera.zoom < 3) {
      window.playerCamera.zoom += 0.1;
      window.playerCamera.updateProjectionMatrix();
    }
  }
}
Exporting coordinates
Then we’ve utilized Three.js to work with 3D-graphics. In this file we’ve only conveyed one function to export screen coordinated to world coordinates.
import * as THREE from 'three';

export function get3DPoint(camera, x, y){
  var mousePosition = new THREE.Vector3(x, y, 0.5);
  mousePosition.unproject(camera);
  var dir = mousePosition.sub(camera.position).normalize();
  return dir;
}
See and download the whole client.js file on Github HERE. There's probably no need to explain how the cameraHelper.js file works, as it is quite simple, and you can download it as well.