SCNAction.rotateByAngle rotation is inverted in El Capitan / Metal

In my game I use SCNAction.rotateByAngle(…) so that pressing the left/right/up/down keys or swiping rotates an object in that direction. But when I test the game on El Capitan, or with Metal as the renderer, the 3D object rotates the other way, i.e. left becomes right and up becomes down.
I haven't found any documentation mentioning that the rotation is "inverted" or "reversed" in El Capitan or Metal.
The code is:
let rotateUp    = SCNVector3(x: 1,  y: 0,  z: 0)
let rotateDown  = SCNVector3(x: -1, y: 0,  z: 0)
let rotateLeft  = SCNVector3(x: 0,  y: 1,  z: 0)
let rotateRight = SCNVector3(x: 0,  y: -1, z: 0)
SCNAction.rotateByAngle(CGFloat(M_PI_2), aroundAxis: vector, duration: 1)
Pretty simple and straightforward.
Any clues as to why this is happening?
Do I have to check which OS is running, or whether Metal is the renderer, and then apply the correct rotations?
Thanks.

In the Apple Developer Forums, an Apple employee has confirmed this.
Apple Developer Forum question & answer
This is right. The behavior of rotateByAngle was wrong (inverted)
before 10.11 and iOS 9, and it is fixed now. Backward compatibility
is ensured (an application linked before iOS 9 / 10.11 will continue to
run the same). If you are building a new application using this
API and want to support previous OS versions, then yes, you will have to
check the version of the OS you are running on. Another option is to use
the "rotateByX:y:z:duration:" variant.
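Following the quoted reply, a minimal sketch of what that OS check could look like (the version cutoffs come from the quote; the axis-flipping helper itself is only an illustration, not SceneKit API):

```swift
import SceneKit

// Per the quoted Apple reply, rotateByAngle was inverted before
// OS X 10.11 / iOS 9, so apps targeting older systems must flip
// the rotation axis there.
let rotationFixed = NSProcessInfo.processInfo().isOperatingSystemAtLeastVersion(
    NSOperatingSystemVersion(majorVersion: 10, minorVersion: 11, patchVersion: 0))

func rotationAxis(axis: SCNVector3) -> SCNVector3 {
    // On fixed systems pass the axis through; on older ones negate it.
    return rotationFixed ? axis : SCNVector3(x: -axis.x, y: -axis.y, z: -axis.z)
}

let action = SCNAction.rotateByAngle(CGFloat(M_PI_2),
    aroundAxis: rotationAxis(SCNVector3(x: 0, y: 1, z: 0)), duration: 1)
```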


Vertical takeoff for a DJI Matrice 100

DJI Android SDK version: 4.11
Matrice 100 / Matrice 600
I am trying to take off the drone vertically.
I tried a GoToAction in a timeline, but it failed due to a bug in the SDK (confirmed by your support team, dev#dji.com #29496): I get STARTED for the GoToAction, but no PROGRESSED or FINISHED, and no errors are logged at all.
Since I need to continue working, I tried a workaround by sending FlightControlData to the VirtualStick by calling the following function with the requested height 20 times a second:
VerticalControlMode.POSITION
FlightOrientationMode.AIRCRAFT_HEADING
VirtualStickModeEnabled = true
VirtualStickAdvancedModeEnabled = true
void sendHeightCommand(float requestedAltitude) {
    // pitch, roll, yaw = 0; in VerticalControlMode.POSITION the fourth
    // parameter is interpreted as the target altitude
    FlightControlData data = new FlightControlData(0f, 0f, 0f, requestedAltitude);
    flightController.sendVirtualStickFlightControlData(data, djiError -> {
        if (djiError != null) {
            log.v(djiError.getDescription());
        }
    });
}
And it works (with the right amount of timeouts), but if there is wind the drone drifts away, which is very dangerous for me: there is more than one drone in the field, and I don't want them to collide.
Is there another way to change the altitude of the drone, while maintaining its position?
Or is there a way to measure the wind, and push back against it?
[*] Take off the drone vertically:
I always use the TakeOffAction in the timelineMission before the GoToAction to ascend to the desired height. However, I'm using a Mavic Pro, and the SDK may behave differently with a Matrice drone.
When using FlightControlData with the virtual sticks, I use the startPrecisionTakeoff() method in the FlightController class; after the takeoff, the drone ascends to the desired position as long as the flight control data is sent continuously.
[*] Stable hovering:
For the hovering, the only low-cost solution I see is to enable VisionAssistedPositioning in the FlightAssistant class. I don't know if the Matrice supports this feature, as the documentation doesn't say anything about the supported aircraft.
OK, so the solution was to use the function setVirtualStickAdvancedModeEnabled(true).
The reason I didn't see any results was that in the simulator I was testing with a 20.0 north wind, which apparently is too much.
When I lowered it to 5.0, it worked perfectly.
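The "20 times a second" loop from the question can be sketched with a plain scheduled executor. This is a self-contained illustration, not DJI SDK code: the CommandSink interface stands in for flightController.sendVirtualStickFlightControlData, and the 50 ms period is just 1/20 s.

```java
import java.util.concurrent.*;

public class AltitudeHold {
    // Hypothetical stand-in for the DJI send call; pitch/roll/yaw are
    // zeroed and the fourth value carries the requested altitude, as
    // in the question's VerticalControlMode.POSITION setup.
    interface CommandSink {
        void send(float pitch, float roll, float yaw, float verticalThrottle);
    }

    // Re-send the same FlightControlData every 50 ms (20 Hz); the DJI
    // virtual stick drops back to hover if commands stop arriving.
    static ScheduledFuture<?> holdAltitude(ScheduledExecutorService exec,
                                           CommandSink sink, float altitudeMeters) {
        return exec.scheduleAtFixedRate(
            () -> sink.send(0f, 0f, 0f, altitudeMeters),
            0, 50, TimeUnit.MILLISECONDS);
    }

    public static void main(String[] args) throws Exception {
        ScheduledExecutorService exec = Executors.newSingleThreadScheduledExecutor();
        CountDownLatch fiveCommands = new CountDownLatch(5);
        ScheduledFuture<?> task =
            holdAltitude(exec, (p, r, y, t) -> fiveCommands.countDown(), 20f);
        fiveCommands.await();        // five commands take about 200 ms at 20 Hz
        task.cancel(false);
        exec.shutdown();
        System.out.println("sent at least 5 commands");
    }
}
```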

Why does a three.js point cloud not render the same on Mac and Windows (Intel HD Graphics 4000)?

I'm developing a huge point-cloud viewer (over 10 million points) using three.js.
But I got a strange result: the rendering is not the same on Mac and Windows.
The first figure below shows the result on Mac, the next one shows it on Windows (7).
Both use Intel HD Graphics 4000.
What is happening in the Chrome browser?
Additional information: the same situation occurs on an iPhone SE, iPhone X, iPad 4, MacBook, MacBook Air, and MacBook Pro; all of those machines display a very sparse point cloud (all with Intel HD Graphics series GPUs).
Only an iMac (2017) displays the huge point cloud successfully. It uses a Radeon Pro 555, not an Intel GPU.
I looked for any info or error messages, but there are no errors in "chrome_debug.log".
=== P.S. === Below is my code:
if (data.point.position.length > 0) {
    var geometry = new THREE.BufferGeometry();
    geometry.addAttribute('position', new THREE.BufferAttribute(data.point.position, 3));
    geometry.addAttribute('color', new THREE.BufferAttribute(data.point.color, 3));
    var material = new THREE.PointsMaterial({
        vertexColors: THREE.VertexColors,
        size: 0.8,
        sizeAttenuation: true
    });
    scene.add(new THREE.Points(geometry, material));
}
=== P.P.S. ===
For everyone: some trial and error before I fully resolve it.
With pointMaterial.sizeAttenuation = false, the FAR perspective view on Mac looks like the one on Windows; however, the NEAR perspective view becomes a sparse point cloud.
But when I create the NEAR perspective view with pointMaterial.sizeAttenuation = true, I get a better result than before.
Thanks a lot for your suggestions.
On both the working and the failing configuration, visit http://webglreport.com/?v=1 and check the value of the OES_element_index_uint extension. My assumption is that this extension is not supported on the failing machine/OS combination (driver support missing on macOS).
This extension is required by three.js to render more than 64k vertices from a single BufferGeometry. On machines that don't support this extension, you'll need to split up your geometry.
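If splitting is needed, a minimal sketch of the chunking step (names are mine, not three.js API): cut the flat position array into slices of at most 65 535 vertices, and back each slice with its own BufferGeometry/Points pair.

```javascript
// Drivers without OES_element_index_uint can address only 65 535
// vertices per indexed draw, so a huge BufferGeometry must be split.
// `positions` is a flat [x, y, z, x, y, z, ...] typed array.
function splitPositions(positions, maxVertices = 65535) {
  const chunks = [];
  for (let i = 0; i < positions.length; i += maxVertices * 3) {
    // slice() on a typed array copies the [i, end) range
    chunks.push(positions.slice(i, i + maxVertices * 3));
  }
  return chunks;
}

// 100 000 points -> two chunks: 65 535 points and 34 465 points.
const chunks = splitPositions(new Float32Array(100000 * 3));
console.log(chunks.length, chunks[0].length / 3, chunks[1].length / 3);
```

Each returned chunk can then be fed to `new THREE.BufferAttribute(chunk, 3)` exactly as in the question's code.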
Your problem is pixel density: Mac devices often have a pixel ratio of 2 or higher on Retina displays. When you set pointsMaterial.size, you're telling it how big each point is in pixels, so make sure you take window.devicePixelRatio into account when assigning the size. Something like
pointsMaterial.size = mySize * window.devicePixelRatio;
should do the trick.
For Chrome users, disabling "Use hardware acceleration when available" can solve the sparse point cloud problem, but the fps may be very low.
I didn't dig into the underlying reason; if someone finds it, please let me know~

Using Core Text on MacBook Pro / HiDPI screen gives different text sizes

I've recently switched to a 15" MacBook Pro, and Core Text is behaving strangely.
I have an app that uses Core Text to draw an NSAttributedString with CTFrameDraw. It works as expected on the external 1080p monitor, but if I drag the window across to the MacBook Pro screen, the font is displayed very small, as if it changed from a 10-point font to a 5-point one when painted. Likewise, if I repaint the text on the MacBook Pro it is still small.
I'd guess it's because the MacBook Pro has the high-resolution screen and the font is being rendered at the native pixel resolution. Could anyone point me to docs on how to handle this? I googled around and came up empty.
Swift 3, Xcode 8.2.1 on macOS 10.12.2.
Thanks in advance.
It was me. I had this bit of code
var c = context.ctm     // the initial CTM includes the 2x Retina backing scale
c = c.inverted()
context.concatenate(c)  // so this wipes out the scale as well
to set the transformation matrix back to the identity. It screws things up on double-pixel (Retina) displays, because resetting to the identity also throws away the backing scale factor.
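A sketch of a safer pattern, assuming the goal was only to undo transforms applied earlier in the same drawing pass (`offsetX`/`offsetY` are hypothetical placeholders for whatever transforms the app applies):

```swift
// Instead of inverting the whole CTM (which also removes the 2x
// backing scale on Retina displays), bracket your own transforms
// with save/restore so only those get undone.
context.saveGState()
context.translateBy(x: offsetX, y: offsetY)   // your own transforms here
// ... draw with CTFrameDraw(frame, context) ...
context.restoreGState()                       // CTM restored, scale intact
```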

Swift 2 SpriteKit issues

So I am having some issues with my SpriteKit game since upgrading to iOS 9, and even after upgrading to Swift 2. I mentioned one here: Atlas images wrong size on iPad iOS 9
However, I am having two more issues I cannot fix.
1)
All my particle effects don't work anymore. This is the code I use, and they are just not showing up. If I use an SKEmitterNode directly it works, but I prefer to add the SKEmitterNode to an SKEffectNode, as it blends much better with backgrounds etc.
This is the code.
let particlesPath = NSBundle.mainBundle().pathForResource("Thrusters", ofType: "sks")!
let particles = NSKeyedUnarchiver.unarchiveObjectWithFile(particlesPath) as! SKEmitterNode
let particleEffects = SKEffectNode() //blends better with background when moving
particleEffects.zPosition = 20
particleEffects.position = CGPointMake(0, -50)
particleEffects.addChild(particles)
addChild(particleEffects)
I read this
http://forum.iphonedev.tv/t/10-8-skeffectnode-or-xcode-7-or-my-issue/669
and it claims it was fixed, but it wasn't.
2)
My Game Center banners, when I log in or when an achievement pops, now use the portrait banner even though my game is in landscape, so banners only cover half of the top of the screen. It looks bad, and since there is no actual code for the banners, I don't even know where to start.
Is anyone else facing these issues? It's frustrating.
Thanks for any help or support.
Some updates to this old question. Believe it or not, regarding the particles, Apple recently replied to my one-year-old bug report to ask whether it is fixed in iOS 10. LOL
I have heard that rendering particles via an SKEffectNode is not necessarily ideal for performance, and I am not using it anymore. Therefore I am not sure whether the bug still occurs with the later Xcode and iOS 9 updates, or in the iOS 10 beta.
In regards to the adMob banner, I simply had to change
let adMobBannerAdView = GADBannerView()
to
var adMobBannerAdView: GADBannerView?
and delay initialisation until viewDidLoad / didMoveToView.
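A sketch of the described change (GADBannerView comes from the Google Mobile Ads SDK; the landscape ad-size constant is an assumption about that SDK's API):

```swift
// Declare the banner as an optional so nothing is created at
// property-initialisation time, before the view hierarchy exists.
var adMobBannerAdView: GADBannerView?

override func viewDidLoad() {
    super.viewDidLoad()
    // Create the banner only once the view (and its orientation)
    // is known, so the correct landscape banner size is picked.
    adMobBannerAdView = GADBannerView(adSize: kGADAdSizeSmartBannerLandscape)
}
```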

How can I capture the screen with Haskell on Mac OS X?

How can I capture the screen with Haskell on Mac OS X?
I've read Screen capture in Haskell?. But I'm working on a Mac Mini, so the Windows solution is not applicable, and the GTK solution does not work: on Macs it only captures a black screen.
How can I capture the screen with … and OpenGL?
Only with some luck. OpenGL is primarily a drawing API, and the contents of the main framebuffer are undefined unless they are drawn to by OpenGL functions themselves. That OpenGL could be abused for screenshots was due to the way graphics systems managed their on-screen windows' framebuffers: after a window without a predefined background color/brush was created, its initial framebuffer content was simply whatever was on the screen right before the window's creation. If an OpenGL context was created on top of this, the framebuffer could be read out using glReadPixels, thereby creating a screenshot.
Today, window compositing has become the norm, which makes abusing OpenGL for taking screenshots almost impossible. With compositing, each window has its own off-screen framebuffer, and the screen's contents are composited only at the end. If you use the method outlined above, which relies on uninitialized memory containing the desired content, on a compositing window system, the results will vary wildly: a solid clear color, wildly distorted junk fragments, or data noise.
Since taking a screenshot reliably must take into account a lot of idiosyncrasies of the system on which it happens, it's virtually impossible to write a truly portable screenshot program.
And OpenGL is definitely the wrong tool for it, no matter that people (including myself) were able to abuse it for this in the past.
I wrote this C code to capture the screen on Macs and show it in an OpenGL window through the glDrawPixels function:
opengl-capture.c
http://pastebin.com/pMH2rDNH
Writing the FFI for Haskell is quite trivial. I'll do it soon.
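A minimal sketch of what such an FFI binding could look like, assuming the C file exposes a capture entry point roughly like `int capture_screen(unsigned char *buf, int w, int h)` (the symbol name and signature are assumptions, not taken from the pastebin):

```haskell
{-# LANGUAGE ForeignFunctionInterface #-}
module ScreenCapture (captureScreen) where

import Foreign.Marshal.Alloc (allocaBytes)
import Foreign.Ptr (Ptr, castPtr)
import Foreign.C.Types (CInt(..), CUChar)
import qualified Data.ByteString as BS

-- Assumed C entry point, to be linked against the compiled C file:
--   int capture_screen(unsigned char *buffer, int width, int height);
foreign import ccall safe "capture_screen"
  c_captureScreen :: Ptr CUChar -> CInt -> CInt -> IO CInt

-- Capture a w*h RGB screen area into a ByteString (3 bytes per pixel).
captureScreen :: Int -> Int -> IO BS.ByteString
captureScreen w h =
  allocaBytes (w * h * 3) $ \buf -> do
    _ <- c_captureScreen buf (fromIntegral w) (fromIntegral h)
    BS.packCStringLen (castPtr buf, w * h * 3)
```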
This might be useful to find the solution in C:
NeHe Productions - Using gluUnProject
http://nehe.gamedev.net/article/using_gluunproject/16013/
Apple Mailing Lists - Re: Screen snapshot example code posted
http://lists.apple.com/archives/cocoa-dev/2005/Aug/msg00901.html
Compiling OpenGL programs on Windows, Linux and OS X
http://goanna.cs.rmit.edu.au/~gl/teaching/Interactive3D/2012/compiling.html
Grab Mac OS Screen using GL_RGB format
