How to get CATransform3D from Projection and ModelView matrices
I have browsed this question before (the URL), but it wasn't completely precise, and I want to know why.
I find that rotation about the x-axis and y-axis comes out reversed (I can't post an image): the trapezoid you see when the CALayer rotates leans the opposite way. Does CALayer rotation not work like OpenGL's? Rotation about the z-axis is correct, though.
I want to get the angle and then rotate twice in the opposite direction; how do I get the angle from the OpenGL matrices?
I want to play a movie with QCAR. I can get the OpenGL matrices, and I want to use a UIView to play the movie and other content (I'm familiar with UIView), making the UIView behave like a texture.
I just posted an example on the QCAR forum; you can check the details here.
The idea is simple.
1. I don't use AVAssetReader to read the video pixels; instead I use AVPlayer and AVPlayerLayer, which can play remote video files.
2. I need to convert the OpenGL model-view matrix to a CATransform3D so the AVPlayerLayer attaches to the trackable image. Thanks to Hammer on Stack Overflow, who shared an example of how to do this (a rough sketch of the conversion follows this list).
3. I tried to render the camera background with OpenGL and the AVPlayerLayer at the same time, but the performance was not good enough, so I use another CALayer to render the camera background. There is a bug in 1.5.8 when getting the camera frame; thanks to andersfrank, the problem was solved here.
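For reference, here is a minimal sketch of that kind of conversion, assuming the model-view matrix arrives as a column-major float[16] in the usual OpenGL/QCAR layout. The helper name is mine, and this is not Hammer's exact code, just the general idea:

#import <QuartzCore/QuartzCore.h>

// Hypothetical helper: copy a column-major OpenGL model-view matrix
// (float[16]) into a CATransform3D. Core Animation applies transforms
// to row vectors while OpenGL uses column vectors, so the straight
// element-for-element copy (which effectively transposes the matrix)
// yields the equivalent transform.
static CATransform3D CATransform3DFromGLMatrix(const float *m)
{
    CATransform3D t;
    t.m11 = m[0];  t.m12 = m[1];  t.m13 = m[2];  t.m14 = m[3];
    t.m21 = m[4];  t.m22 = m[5];  t.m23 = m[6];  t.m24 = m[7];
    t.m31 = m[8];  t.m32 = m[9];  t.m33 = m[10]; t.m34 = m[11];
    t.m41 = m[12]; t.m42 = m[13]; t.m43 = m[14]; t.m44 = m[15];
    return t;
}

If x/y rotations come out mirrored, as in the question above, it is often because the two coordinate systems disagree on the y direction; concatenating a scale of (1, -1, 1) onto the result is one common way to compensate.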
I put my code on pastebin:
http://pastebin.com/kmeVZ7jy
There are still some problems to solve, but I think this is the right approach.
PS. For question 3 in my original post, you can fix it by setting
cameraLayer.contentsGravity = kCAGravityResizeAspectFill;
This fixes the problem of the camera image's aspect ratio being different from the iPhone screen's.
Update
All the problems are solved; please check my original post on the Vuforia forum and the pastebin.
I textured an object in Blender because it wouldn't texture properly in Unity, and then imported the object and texture to Unity.
I don't know how to fix this; I'll put both pictures here.
Blender Texture Before Import
Object In Unity
Okay, so based on your screenshots:
You're going to select everything, then add a modifier called "Solidify" and set it to something very small, like 0.03 (Unity doesn't like objects that are just planes).
Double-check that all your normals are facing out. Let me know if you don't know how to do this.
Go into Edit Mode, select all edges, then right-click and select "Mark Seam".
Open the UV editor window (it should be split screen, with Edit Mode on one side and the UV view on the other). On the edit side, select all, then go to the UV dropdown menu and click "Unwrap". You should then see your object unfolded into flat planes on the UV window side. There are different unwrap options, like smart UV unwrap, etc.; plain "Unwrap" has worked for me, but play around, as there may be an option that shows your object's shapes in a less distorted way.
At this point, since your pattern is basically repeating: if you export the OBJ file, take it into Unity, and add the image file (make sure its dimensions are perfectly square), it should receive the image as an Albedo texture much better than in your screenshots. You can play with the Tiling and X/Y Offset values until it looks right (you might face issues with rotation, though).
BUT
If you want to line it up very specifically, you can export the UV layout as a PNG from the UV window in Blender, then use Photoshop or another photo editor to change/rotate and arrange your texture so the sides line up properly. In Blender's edit window (assuming you still have both the UV and edit views open), selecting a side highlights the corresponding flat plane in the UV window; based on this you should be able to figure out what needs to be rotated up/down, etc. Then, when you change that 2D image and drag it into Unity, it will adjust and wrap around the object.
I'm pretty new to both, but the advice I've been given is not to do the texturing in Blender, but to do it in Unity instead.
This is a 10-month-old post, but in case anyone is curious or struggling with the same thing: Blender exports models with a scale of 100, so you need to scale up the material's tiling (in the material settings) to see it.
This is a bad solution, however, because then you are not working with objects at a scale of 1, so what you actually want is to check "Apply Transformations" when exporting the FBX model from Blender.
I tried to use MPFoldTransition for my UIView animation, but I want it to work like Flipboard; that is, it should animate as the user's touch moves.
Please help me figure out how I can achieve this. Also, is there any other library for Flipboard-like UIView animation?
If you want to use a library for this flip, MPFlipViewController uses MPFoldTransition and works with a pan gesture.
Also, take a look at https://github.com/ITechRoof/ITRFlipper
If you want to build this yourself and have a true flip like Flipboard (where it flips in the middle, with appropriate shadows), taking a snapshot of the view and using a simple matrix transform can do the trick. I'm not going to go into all the details, as I could talk for hours about animation timing and shadows, but a minimal sketch of the core idea is below.
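Here is a rough sketch of that approach, assuming a UIPanGestureRecognizer drives the fold and that the top half of the page has already been captured as a snapshot view (the controller name, the topHalfSnapshot property, and the perspective value are illustrative, not MPFoldTransition's actual implementation):

#import <UIKit/UIKit.h>
#import <QuartzCore/QuartzCore.h>

// Hypothetical controller holding a snapshot of the top half of the page,
// e.g. obtained via snapshotViewAfterScreenUpdates:.
@interface FlipSketchViewController : UIViewController
@property (nonatomic, strong) UIView *topHalfSnapshot;
@end

@implementation FlipSketchViewController

// Fold the snapshot around its bottom edge, driven by drag progress (0...1).
- (void)applyFoldProgress:(CGFloat)progress
{
    CALayer *layer = self.topHalfSnapshot.layer;
    // Hinge at the bottom edge of the half. (Changing anchorPoint shifts the
    // layer; a real implementation would compensate `position` accordingly.)
    layer.anchorPoint = CGPointMake(0.5, 1.0);

    CATransform3D transform = CATransform3DIdentity;
    transform.m34 = -1.0 / 500.0;                               // perspective
    transform = CATransform3DRotate(transform, -progress * M_PI_2, 1, 0, 0);
    layer.transform = transform;

    // Cheap stand-in for the fold shadow: darken as the half turns away.
    self.topHalfSnapshot.alpha = 1.0 - 0.5 * progress;
}

// Drive the fold from a pan gesture instead of a fixed animation.
- (void)handlePan:(UIPanGestureRecognizer *)pan
{
    CGFloat translation = [pan translationInView:self.view].y;
    CGFloat progress = MIN(MAX(translation / self.view.bounds.size.height, 0), 1);
    [self applyFoldProgress:progress];
}

@end

Because the transform is recomputed from the gesture's translation on every pan event, the fold tracks the finger instead of running as a fixed animation, which is the Flipboard-like behavior the question asks for.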
I am trying to use the combined camera (I found it under "Examples").
The problem is that when I use it in orthographic mode, I still see the arrows and the box helper as if in perspective view.
For example, when I try to zoom in with the mouse scroll, the plane stays the same size (as it should in an orthographic view), but the arrows and the small box between the arrows get smaller or bigger.
When I tried to debug it in the render function, I saw that the camera is still in orthographic mode when it renders the arrows.
Any idea how I can make all the objects render in orthographic view but still use the combined camera?
Edit:
I am not sure which part of the code I should post, so I added a picture to describe my problem.
You can see that I am using an orthographic camera; when I try to zoom in, the axis arrows get bigger or smaller.
the difference between the plane when zooming
Found a possible answer which worked for me:
under TransformControls.js
change the update function to:
scale = (camera.inOrthographicMode == true) ? 100 : worldPosition.distanceTo(camPosition) / 6 * scope.size;
I am trying to achieve an effect similar to one of the examples Google has put out with their Cardboard app, called 'Exhibit'. I have a 3D object that I want to rotate using device orientation controls. Right now, with just the device orientation controls, I can view the 3D object, but when I turn around, the camera rotates (it seems), causing the object to fall out of view until I turn all the way back around to where it was in the beginning. In other words, the camera seems to rotate on its axis as I look around. What I want is to rotate the object itself as I turn around.
Kind of like this example, http://threejs.org/examples/#misc_controls_orbit, except I want to rotate using device orientation controls.
Any idea how I can incorporate this feature?
Thank you for your assistance.
The answer to my own question for future seekers is to replace camera in
controls = new THREE.DeviceOrientationControls(camera, true);
with the 3D object you are trying to rotate.
I have a scroll view that contains a PDF page rendered with a CATiledLayer, and I want to draw on top of the PDF page, so I created an overlay layer. I need the graphics to look vector-sharp, so I decided to use a CATiledLayer for the overlay layer as well. The only problem is that it is very slow to draw (I'm drawing with a bezier path). I tried to optimize it by creating the overlay layer with just the visible width and height when zooming in and out, so I don't have to create the overlay for the whole content bounds, but still no luck. I wanted to try a plain CALayer, but the drawn path just becomes blurry and pixelated, so I'm not sure how I can improve on this. I also tried drawInRect, but for some reason it doesn't seem to work.
I suggest not using a bezier path to draw annotations, since it requires you to redraw the whole path every time the pen moves. It is better to draw only the current line segment using CGContextAddQuadCurveToPoint; a sketch of these steps follows the list below.
1. In touchesMoved, get the current point and the two previous points.
2. Using those points, compute the area where the new segment should be drawn.
3. Invalidate only that area with setNeedsDisplayInRect: so drawRect: redraws just that part.
4. Inside drawRect:, go to the end of your path and add the new segment using CGContextAddQuadCurveToPoint.
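A minimal sketch of those steps, assuming the overlay is a plain UIView (the class name and instance variables are illustrative). It uses the common midpoint trick for the quad curve and, as a simplification, keeps earlier drawing in the backing store via clearsContextBeforeDrawing = NO instead of compositing finished strokes into an image:

#import <UIKit/UIKit.h>

// Midpoint of two points; used to smooth the stroke.
static CGPoint QuadMidPoint(CGPoint a, CGPoint b)
{
    return CGPointMake((a.x + b.x) * 0.5, (a.y + b.y) * 0.5);
}

// Illustrative overlay view: each touch move invalidates and redraws only
// the small rect covering the newest curve segment.
@interface AnnotationOverlayView : UIView
@end

@implementation AnnotationOverlayView {
    CGPoint _currentPoint;
    CGPoint _previousPoint1;   // one move ago
    CGPoint _previousPoint2;   // two moves ago
}

- (instancetype)initWithFrame:(CGRect)frame
{
    if ((self = [super initWithFrame:frame])) {
        self.opaque = NO;
        self.backgroundColor = [UIColor clearColor];
        // Simplification: keep what was drawn before and only add the newest
        // segment each time. A production version would usually cache
        // finished strokes into an image instead.
        self.clearsContextBeforeDrawing = NO;
    }
    return self;
}

- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    _currentPoint = _previousPoint1 = _previousPoint2 = [touch locationInView:self];
}

- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    _previousPoint2 = _previousPoint1;
    _previousPoint1 = [touch previousLocationInView:self];
    _currentPoint   = [touch locationInView:self];

    // Steps 2-3: invalidate only the rect that the new segment touches.
    CGPoint start = QuadMidPoint(_previousPoint1, _previousPoint2);
    CGPoint end   = QuadMidPoint(_currentPoint, _previousPoint1);
    CGRect dirty  = CGRectMake(MIN(start.x, end.x), MIN(start.y, end.y),
                               fabs(start.x - end.x), fabs(start.y - end.y));
    dirty = CGRectUnion(dirty, CGRectMake(_previousPoint1.x, _previousPoint1.y, 1, 1));
    [self setNeedsDisplayInRect:CGRectInset(dirty, -4, -4)]; // pad for line width
}

// Step 4: draw only the newest quad-curve segment.
- (void)drawRect:(CGRect)rect
{
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGPoint start = QuadMidPoint(_previousPoint1, _previousPoint2);
    CGPoint end   = QuadMidPoint(_currentPoint, _previousPoint1);

    CGContextMoveToPoint(ctx, start.x, start.y);
    CGContextAddQuadCurveToPoint(ctx, _previousPoint1.x, _previousPoint1.y, end.x, end.y);
    CGContextSetLineWidth(ctx, 2.0);
    CGContextSetLineCap(ctx, kCGLineCapRound);
    CGContextSetStrokeColorWithColor(ctx, [UIColor blackColor].CGColor);
    CGContextStrokePath(ctx);
}

@end

The point of steps 2 and 3 is that only the few pixels around the newest segment are redrawn per touch event, which is what makes this noticeably faster than re-stroking the whole bezier path.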