I've searched a lot on Google, but I haven't found the answer I need.
I have a lot of animation files created for iOS that come with a sprite sheet and a corresponding .plist file.
I need to import the frames of these animations into Unity, but I don't know how, because Unity has no direct support for .plist files and I haven't found software that extracts the correct frames from the sprite sheet. Does anyone know how I can do that?
Use System.Xml.XmlDocument (with TextAsset.text) to parse the .plist, and TextureImporter.spritesheet with SpriteMetaData to read/write the sprite sheet data.
I'm using XmlDocument in production (it is fast) to read SVG files. For TextureImporter, here you can see an example where I nudge sprite sheets directly with it.
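If you just want to see what's inside one of those .plist files before writing the editor script, you can parse it outside Unity first. Below is a minimal Python sketch; the key names ("frames", "frame") and the rect string format are assumptions based on the common TexturePacker layout, so adjust them to your files. Inside Unity you would do the equivalent parsing with XmlDocument in C# and write the resulting rects into TextureImporter.spritesheet.
# Minimal sketch: dump the frame rects from a TexturePacker-style .plist.
# Key names and rect format are assumptions; adapt to your files.
import plistlib
import re

with open("animation.plist", "rb") as f:
    data = plistlib.load(f)

for name, info in data["frames"].items():
    # "frame" is typically a string like "{{x,y},{w,h}}"
    x, y, w, h = map(int, re.findall(r"-?\d+", info["frame"]))
    print(name, x, y, w, h)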
I'm having trouble adding a GIF to my Beamer presentation. I've searched for answers, but I haven't found one that was written after Adobe retired its Flash Player.
I've tried using the animate package, and the file does compile, but it doesn't let me play the animation (the controls appear, but nothing happens when I click them). I've tried the media9 and movie15 packages, but apparently they are obsolete. So I am very confused about how to add an animation at this point. And honestly, I don't even know whether the lack of Adobe Flash Player is the problem. I know there has to be a simple answer.
I don't think it's necessary to add my code, since I feel it's such a trivial matter.
Thanks in advance!
You can split your .gif file into individual images. There are many different converters to be found online, or you could use ImageMagick from the command line:
convert -coalesce test.gif test.png
This will result in a series of images called test-0.png etc.
You can then include these in your beamer presentation using the xmpmulti package.
To animate this sequence of slides, you can use \transduration<0-16>{0}. Replace 16 with the number of images you have and {0} with the duration in seconds each image should be shown. With 0 seconds, as in my example, the timing is determined by how long your computer takes to render each slide.
If you now open the presentation with Adobe Reader in presentation mode, the slides will change automatically and thus create an animation.
\documentclass{beamer}
\usepackage{xmpmulti}
\begin{document}
\begin{frame}
\transduration<0-16>{0}
\multiinclude[<+->][format=png, graphics={width=\textwidth}]{test}
\end{frame}
\end{document}
Or, if abandoning the PDF format is an option, you could have a look at the media4svg package.
In a Qt Quick Controls 2 project, I need to draw several animated SVGs. As AnimatedImage doesn't support SVG animation, I was able to work around it by creating my own C++ component and using a timer to refresh the animation at a regular interval. So I know that my animated SVGs are rendered correctly by the QtSvg renderer.
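For reference, that timer-based workaround looks roughly like this; I've sketched it in Python with PySide6 for brevity (my real component is C++), and the class name, refresh interval, and SVG path are illustrative only:
# Rough sketch of the workaround: a painted item that re-renders the SVG
# on a timer so the QtSvg renderer's animation state is redrawn regularly.
from PySide6.QtCore import QRectF, QTimer
from PySide6.QtQuick import QQuickPaintedItem
from PySide6.QtSvg import QSvgRenderer

class AnimatedSvgItem(QQuickPaintedItem):
    def __init__(self, parent=None):
        super().__init__(parent)
        self._renderer = QSvgRenderer("spinner.svg")  # illustrative path
        self._timer = QTimer(self)
        self._timer.timeout.connect(self.update)      # schedule a repaint
        if self._renderer.animated():
            self._timer.start(33)                     # ~30 fps refresh

    def paint(self, painter):
        # QtSvg advances the animation internally; we just redraw often.
        self._renderer.render(painter, QRectF(0, 0, self.width(), self.height()))
In the real project the component is registered with qmlRegisterType and instantiated from QML.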
However, I want a solution supported natively by the AnimatedImage Qt Quick control. I'm able to compile and modify the code of the QtSvg module, as well as its qsvg plugin, which is the bridge between the QtSvg module and the QImageIOHandler interface used by AnimatedImage to draw the content.
But even if I add the necessary code in the plugin to support the animation (Animation property support, functions related to the animation, ...), the AnimatedImage component continues to draw a static SVG.
I dug into the plugin caller's source code and came to some conclusions:
The plugin caller caches the frames in several QPixmaps, and reads the animation loop only once (after that, the cache is replayed)
The read() function is called several times in the plugin if, and only if, the source image file is divided into frames, as is the case e.g. in GIF images or cursor (.cur) files: a frame is read and cached, the file cursor is moved to the next frame, the next frame is read and cached, and so on
For that reason, the way SVG images are animated may be incompatible with the Qt animated image engine: the whole SVG data is read only once, which tells the engine that the image contains only one frame, instead of letting the frame count be determined by the duration and the FPS
So I have the following questions:
Are my conclusions above correct? If not, how do AnimatedImage and the Qt image plugins work under the hood?
Is there a way to create a Qt image plugin that supports animated SVG images? If yes, how can I do that?
I could not find a good, well-explained document about how to create an animated image plugin; can someone recommend one?
Why doesn't Qt support animated SVG natively, and is anything planned in the future to fix this situation?
NOTE: I know that I could use QtWebEngine to support animated SVGs, but it's not an option for me, because:
QtWebEngine adds ~300 MB to my application. As its current size is ~10 MB, such an increase isn't reasonable just to show a few animated images.
QtWebEngine brings several side effects in my case which remain unanswered, e.g.: https://forum.qt.io/topic/114326/webengine-how-to-draw-an-animated-svg-above-a-transparent-web-view
So please don't propose such an alternative as a solution.
I haven't dabbled with animated GIFs for a very long time. I'd like to create something similar to this:
https://dribbble.com/shots/2941889-Wind
I can't imagine that whoever created this animated it frame by frame to get that result. So I'd like to know what tools there are for doing something like that. Or perhaps I'm all wrong and the only way to do it is frame by frame?
Just some advice on where to get started, if anyone is able.
So what I did in the end was use Adobe Illustrator to create the paths.
I then changed the stroke to dashed. By modifying the dash I was able to make it appear as though the line was growing along the path. I then created more paths.
I then created a JS file to incrementally increase the dash and save a snapshot each time. This gave me over 100 snapshots.
I imported these into Adobe Photoshop and exported the GIF. Bingo!
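For anyone who would rather script the whole pipeline instead of going through Illustrator and Photoshop, here is a rough Python sketch of the same "growing dash" idea, assuming the cairosvg and Pillow packages are installed; the path data, path length, frame count, and file names are placeholders:
# Rough sketch: render a path with a shrinking stroke-dashoffset frame by
# frame, then assemble the frames into a GIF. PATH, PATH_LENGTH and FRAMES
# are placeholders for your own artwork.
import io
import cairosvg
from PIL import Image

PATH = "M 10 80 C 40 10, 120 10, 150 80"   # placeholder path data
PATH_LENGTH = 300                           # total path length (estimate)
FRAMES = 50

SVG = """<svg xmlns="http://www.w3.org/2000/svg" width="200" height="120">
  <path d="{path}" fill="none" stroke="black" stroke-width="3"
        stroke-dasharray="{length}" stroke-dashoffset="{offset}"/>
</svg>"""

frames = []
for i in range(FRAMES + 1):
    offset = PATH_LENGTH * (1 - i / FRAMES)        # dash "grows" along the path
    svg = SVG.format(path=PATH, length=PATH_LENGTH, offset=offset)
    # white background so the transparent SVG doesn't turn black in the GIF
    png = cairosvg.svg2png(bytestring=svg.encode(), background_color="white")
    frames.append(Image.open(io.BytesIO(png)).convert("RGB"))

frames[0].save("wind.gif", save_all=True, append_images=frames[1:],
               duration=40, loop=0)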
I need to import animations from Maple into my LaTeX/Beamer presentation. I save a file in GIF format, but later I have problems converting that file into PNG: all I get is a static PNG file and can't proceed. What's the full code to do that in LaTeX?
You can use the animate package to animate a series of PNGs. To get the series of PNGs from an animated GIF, use a tool like ImageMagick's convert.
Does this help: LINK? (This is the same answer as marcog... just wanted to provide a reference to it being asked previously -- the solution was the same: the animate package).
Also, your OS will matter. I don't know that Linux (not saying you're using it) has any ability to play animated PDFs. I've tried embedding movies using LaTeX and while it "works," you can't actually view them in anything Linux offers yet. Okular is working on it, but last I checked (a couple of months ago?) it wasn't possible yet.
Anyway, I just wanted to add that in case you were doing everything completely right and by chance are not seeing the fruits of your labor because you're using a Linux viewer. Check your work with Acrobat on Windows to be sure.
I am trying to do animations on the iPhone using OpenGL ES. I am able to do the animation in the Blender 3D software. I can export a .obj file from Blender to OpenGL and it works on the iPhone.
But I am not able to export my animation work from Blender 3D to OpenGL. Can anyone please help me solve this?
If you have a look at this article by Jeff LaMarche, you'll find a Blender script that will output a 3D model to a C header file. There's also a follow-up article that improves upon the aforementioned script.
After you've run the script, it's as simple as including the header in your source and passing the array of vertices to your drawing function. Ideally you'd want a way to load arbitrary model files at runtime, but for prototyping this method is the simplest to implement.
Seeing as you already have a method of importing models (.obj), the above may not apply. However, the advantage of using a Blender script is that you can then modify it to suit your own needs, perhaps also exporting bone information or model keyframes.
Well first off, I wouldn't recommend .obj for this purpose since the obj file format doesn't support animation, only static 3D models. So you'll need to export the animation data as a separate file that you load at the same time as the obj.
Which file format I would recommend depends on what exactly your animations are. I don't remember off the top of my head what file formats Blender supports, but as I recall it does not export Collada files with animation, which would be the most general recommendation. Other options would be md2 for character animations, or 3ds for simple "rigid objects moving around" animations. I think Blender's FBX exporter will work, although that file format may be too complicated for your needs.
That said, and assuming you only need simple rigid-object movements, you could use .obj for the 3D model shapes and then write a simple Python script to export a file from Blender that lists the keyframes, with the frame number, position, and rotation for each keyframe. Then load that data in your code and play back those keyframes on the 3D model; a rough sketch of such a script follows.
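Here is a rough sketch of such an export script, run from Blender's text editor. For simplicity it samples every frame rather than only the keyframes, and the output path and plain-text format are just placeholders for whatever your loader expects:
# Sketch: dump frame number, position and rotation of the active object,
# one line per frame. Output path and format are placeholders.
import bpy

scene = bpy.context.scene
obj = bpy.context.active_object

with open("/tmp/animation_keys.txt", "w") as out:
    for frame in range(scene.frame_start, scene.frame_end + 1):
        scene.frame_set(frame)                      # move the scene to this frame
        loc = obj.matrix_world.to_translation()
        rot = obj.matrix_world.to_euler()
        out.write("%d %f %f %f %f %f %f\n" %
                  (frame, loc.x, loc.y, loc.z, rot.x, rot.y, rot.z))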
This is an old question, and since then some new iOS frameworks have been released, such as GLKit. I recommend relying on them as much as possible when you can, since they take care of many conversions like this, though I haven't researched the specifics. Also, while not on iOS, the new scene graph technology for OS X (which will likely arrive on iOS in the future) takes all this quite a bit further, and a crafty individual could do some conversions with that tool and then take the output to iOS.
Also have a look at SIO2.
I haven't used recent versions of Blender, but my understanding is that it supports exporting mesh animation as a sequence of .obj files. If you can already display a single .obj in your app, then displaying several of them one after another will achieve what you want.
Now, note that this is not the most efficient way to export this type of animation, since each .obj file will contain a lot of duplicated information. If your mesh stays fixed over time (i.e. only the vertices move, while the polygon structure, UV coords, etc. stay fixed), then you can import the entire first .obj and, from the rest, read only the vertex array.
If you wanted to optimize this even more, you could compress the vertex arrays so that you only store the differences from the previous frame of the animation; a toy sketch of this idea follows.
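As a toy illustration of that delta idea (the .obj parsing itself is omitted, and vertices are plain (x, y, z) tuples), in Python:
# Toy sketch: keep the first frame's vertex positions in full and store only
# per-vertex differences for each following frame, then rebuild on load.
def compress(frames):
    deltas = []
    for prev, cur in zip(frames, frames[1:]):
        deltas.append([(cx - px, cy - py, cz - pz)
                       for (px, py, pz), (cx, cy, cz) in zip(prev, cur)])
    return frames[0], deltas

def decompress(first, deltas):
    frames = [first]
    for delta in deltas:
        prev = frames[-1]
        frames.append([(px + dx, py + dy, pz + dz)
                       for (px, py, pz), (dx, dy, dz) in zip(prev, delta)])
    return frames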
Edit: I see that Blender 2.59 has export to COLLADA. According to the Blender manual, you can export object transformations, and you can also export baked animation for rigged objects. The benefit for you in supporting the COLLADA format in your iPhone app is that you are free to switch between animation tools, since most of them export this format.