My project is growing. It includes about 16,000 .m4a (sound) files, because it's an app that helps people learn languages through examples, but only a few classes and files containing code.
Since I added those 16,000 files, working on this project has been a pain. Renaming any file takes time, and compiling, building, and launching the app takes ages. I understand that about 200 MB has to be transferred, but the problem is that the computer becomes unresponsive while that happens.
Fortunately I have an SSD and 8 GB of RAM; I don't even want to think about how long it would take on an HDD.
Is there any way to improve the performance?
I'll also be responsible for creating more than ten similar apps for other language pairs, and I would like to have all of them in one project and just switch between targets. So if I don't do anything about performance now, there is a high probability that one day I'll throw this computer out the window of my house on the 2nd floor...
You can try downloading each .m4a from the web only when you need it. That means the app will be thin when a user downloads it, but the first time a sound file has to be played, it is fetched from the web and saved to local storage; the next time you have to play that file, you play it from local storage.
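The pattern is language-agnostic; here is a minimal sketch of the idea in C# (the base URL, cache folder, and file names are hypothetical, and an iOS app would use the equivalent platform APIs):

```csharp
// Minimal sketch of "download on first use, then cache locally".
// BaseUrl and the cache folder are hypothetical.
using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

static class AudioCache
{
    static readonly HttpClient Http = new HttpClient();
    const string BaseUrl = "https://example.com/audio/"; // hypothetical server

    static readonly string CacheDir = Path.Combine(
        Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData),
        "audio-cache");

    // Returns a local path for the clip, downloading it only the first time.
    public static async Task<string> GetClipAsync(string fileName)
    {
        Directory.CreateDirectory(CacheDir);
        string localPath = Path.Combine(CacheDir, fileName);

        if (!File.Exists(localPath))                                 // not cached yet
        {
            byte[] bytes = await Http.GetByteArrayAsync(BaseUrl + fileName);
            await File.WriteAllBytesAsync(localPath, bytes);         // cache for next time
        }
        return localPath;                                            // play from local storage
    }
}
```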
And yeah, Xcode has many problems; this is one of them.
I solved the problem by creating an additional Core Data SQLite store that contains all of the resources, so the entity looks like:
name (NSString) - name of the file
data (NSData) - binary contents of the file
It works like a charm. Quick builds, quick debugging, just like before.
TL;DR: How can I stream content (specifically music and videos) from a cloud storage solution like Google Drive without having the entire file cached first? My goal is to create a Netflix/YouTube-esque experience with my movie/music library.
So, this seems to be an issue that many people are having, and so many forum posts say that PlexCloud is the solution, but it isn't available anymore, so I want to find another way.
Essentially, I would like to free up space on my local machine, offloading my movies and music to the cloud. I would like these files to be available instantly from any of my devices.
The solutions I have come across so far are:
Google File Stream (or similar)
Expandrive
CloudMounter
These apps mount your cloud storage as a network drive and allow you to store files on the cloud with "instant" access. They sound great in principle, but the issue with all of them is that the entire file has to be cached before you can watch or listen. This defeats the whole purpose of keeping the files in the cloud, because every time you want to watch a video, the whole file has to be cached first. That is very inconvenient for me: I have a rather slow internet connection and monthly transfer limits, and I have to wait until caching finishes before I can watch.
The closest I've got to making this work is with Kodi, but the interface is horrible on anything other than a TV. On desktop or mobile, it's useless! But, as far as functionality, the way it retrieves files is perfect. On their website, it says that it only caches up to ~60MB at a time, meaning you can start watching/listening instantly, and the file doesn't need to be cached in its entirety.
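(For context, that kind of partial caching usually comes down to HTTP range requests, assuming the server supports them. A rough C# sketch with a hypothetical URL:)

```csharp
// Fetch only the first chunk of a remote file via an HTTP Range request --
// roughly what "cache ~60 MB at a time" players rely on.
// The URL is hypothetical; the server must support byte ranges.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class RangeDemo
{
    static async Task Main()
    {
        using var http = new HttpClient();
        var request = new HttpRequestMessage(HttpMethod.Get, "https://example.com/movie.mp4");
        request.Headers.Range = new RangeHeaderValue(0, 60L * 1024 * 1024 - 1); // first ~60 MB only

        using var response = await http.SendAsync(request);
        byte[] chunk = await response.Content.ReadAsByteArrayAsync();
        Console.WriteLine($"Got {chunk.Length} bytes, status {response.StatusCode}"); // expect 206 PartialContent
    }
}
```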
So my questions are:
Is there an alternative to Kodi that works on all major OSes, where files are instantly available and the caching works like YouTube or Netflix, with only a small portion of the file cached at a time?
Is it actually possible to play a video natively in the OS (in an app like VLC) before the entire video is stored on the local disk, either in storage or in cache?
If so, how would I go about doing this?
A few conditions for the solution:
I don't want to have to use the browser every time - A desktop/mobile app, Finder, or File Explorer is essential.
Ideally something that will run on Android TV, or at least is able to use Chromecast.
Files must be instantly accessible - nothing that will cache the entire file first (unless this is impossible due to how OS's work).
If possible, I would prefer NOT to have to go through some massively complicated setup with coding, terminal commands, or a dedicated server. The solution must use cloud storage, ideally with an app that works on all major OSes.
Thanks in advance for help and suggestions!
TL;DR: What is the reason for the complex file management systems, such as GitHub repositories, used when working in Visual Studio?
This has been bothering me for a while. I've finished a diploma course in Digital Media and have started another course in programming. One thing that stuck out immediately after coming from 3D art is how incredibly awkward and obtuse basic file management is when working with Visual Studio. Presumably the same issues arise with other development environments; if they didn't, I can't imagine why anybody would ever use VS.
For example, let's say I want to work on a project in 3ds Max. It's stored on a shared network drive, so I don't want to risk two people accessing it at the same time and saving over each other's work. I simply grab the folder or file that I want to use, copy and paste it with a new name, and then I'm good to go.
Saving things with a new name is easy, just save as, rename it. I can work from network drives, local drives, portable drives. The file can come from anywhere and be saved anywhere. Everything is fast, painless, and clear.
If I was to try and do the same thing in VS, for starters, it wouldn't let me build the program while saved to the network, so I'd have to copy it over to a local drive. Presumably this is to prevent the "multiple people accessing, saving over each other" issues that are easily avoided by just renaming the thing.
If I wanted to do iteration saves, that is, frequently save the project under a version-numbered name to allow easy rollbacks and troubleshooting, I'm not even sure how I'd do it. Renaming projects/solutions has proven so hard that I've had to delete projects and recreate them with a new name, rather than try to figure out how to do it properly.
There are all sorts of complex file management systems that VS seems to require for any large project work, all of which would be completely unnecessary if you could just copy, paste, rename and save-as with any real ease.
I'm obviously rather new to this, and I'm certain that there is an important reason why it's so awkward to manage files, I just don't know what that reason is. I feel like I'd have a far better understanding of how all these file management systems actually work if I knew why they existed in the first place. At the moment, just trying to be able to work from a network drive is taking up hours when it would be a non-issue in every other digital media field I've worked with.
Preamble:
Recently I came across an interesting story about people who seem to be sending emails with documents that contain child pornography. Here is an example (this one is a JPEG, but I'm hearing about it being done with PDFs, which generally can't be previewed):
https://www.youtube.com/watch?v=zislzpkpvZc
This can pose a real threat to people in investigative journalism, because even if you delete the file after it's been opened, the copy in Temp may still be recovered by forensic software. Even just having opened the file already puts you in the realm of committing a felony.
This can also pose a real problem for a group's security consultants. Let's say person A emails criminal files, and person B is suspicious of the email and forwards it to the security manager for their program. To analyze the file, the consultant may have to download it to a hard drive, even if they load it in a VM or sandbox. Even if they figure out what it is, they are still in a legal minefield where bad timing could land them in jail for 20 years. Thinking about this, if the data only ever entered RAM, then upon power-down all traces of the opened file would disappear.
Question: I have an OK understanding of how computer architecture works, but the problem presented above made me start wondering. Is there a limitation at the OS, hardware, or firmware level that prevents a program from opening a stream of downloading information directly into RAM? If not, say you try to open a PDF: is it possible for the file to instead be passed to the program as a stream of downloading bytes, so that the final file is never retained on the HDD?
Unfortunately I can only give a Linux/Unix based answer to this, but hopefully it is helpful and extends to Windows too.
There are many ways to pass data between programs without writing to the hard disk; it is usually more a question of whether the applications involved support it (the web browser and PDF reader in your example). Streams can be passed via pipes and sockets, but the problem is that it may be more convenient for the receiving program to seek back in the stream at certain points rather than keep all the data in memory; this may also be a more efficient use of resources. Hence many programs do not do this. A pipe can indeed be made to look like a file, but if the application tries to seek backward, it will cause an error.
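A quick sketch of that constraint in C#: a pipe carries the bytes like a file stream would, but any attempt to seek backward fails.

```csharp
// Data passed through an anonymous pipe behaves like a stream,
// but seeking is not supported -- the constraint described above.
using System;
using System.IO;
using System.IO.Pipes;
using System.Text;

class PipeDemo
{
    static void Main()
    {
        using var server = new AnonymousPipeServerStream(PipeDirection.Out);
        using var client = new AnonymousPipeClientStream(PipeDirection.In, server.ClientSafePipeHandle);

        byte[] payload = Encoding.UTF8.GetBytes("pretend this is a PDF");
        server.Write(payload, 0, payload.Length);

        var buffer = new byte[payload.Length];
        int read = client.Read(buffer, 0, buffer.Length);
        Console.WriteLine($"Read {read} bytes; CanSeek = {client.CanSeek}");  // CanSeek = False

        try { client.Seek(0, SeekOrigin.Begin); }                             // a reader that wants to rewind...
        catch (NotSupportedException) { Console.WriteLine("Seeking backward in a pipe is not supported."); }
    }
}
```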
If there were more demand for streaming data into applications, it would probably be supported more widely, as there are no major barriers. Currently it is more common to store PDFs in a temporary file if they are viewed in a plugin rather than downloaded. Video can be different, though.
An alternative is to use a RAM drive; a Linux system commonly has at least one set up by default (tmpfs), although on Windows it seems you have to install additional software. Using one removes the above limitations, and it is fairly easy to configure a web browser to use it for temporary files.
In most of the games and programs you download, you just get the installer.
Some .exe files can be run directly, though (probably because they don't have many files to extract, huh?).
I was wondering: what's the difference between an installer that just extracts the files and a zip (rar, iso, ...) archive that you could download, depending on your internet speed, in a few seconds? And where does a, say, 200 MB installer fetch 5 GB of files from while offline?
I've never heard about this before, and I'm learning to program, so I'd appreciate a proper answer.
What you're really asking is:
How does an installer work?
A bit of background.
In the Before Times, man did not have such things as "installers." Software was run directly off of floppy disks (and none of that rigid 3.5" crap, I'm talking disks that flopped), like God intended.
Then came the first home computers with persistent hard drives. For the first time, it made sense to copy a program off a disk and have it stick around.
But programs still worked the way "portable" applications do today: you copied them as-is and ran them as-is.
Then operating systems began to get more complicated.
Windows introduced this notion of a registry: a central location where program and operating system configuration could be stored. Software authors began using this registry. Its arcane architecture and user-hostile editing utility (the infamous regedit.exe) made it the perfect place to store shareware information -- how many days you have left on your trial, for example.
This happened around the same time that programs began to be too large to fit -- uncompressed -- on a single floppy disk. A way was needed to split a program onto multiple disks. Since it wasn't very user-friendly to require the user to have e.g. a ZIP extractor installed (remember, this was before ubiquitous Internet), Windows programs began to be shipped with installers. You can think of these as basically portable versions of WinZIP whose sole purpose was to reassemble and extract a compressed file.
These days, installers serve a number of other purposes:
providing a convenient user interface
prompting the user to accept a click-through end-user license agreement (EULA)
prompting the user for CD keys (though this is being phased out for many systems in favor of digital distribution)
asking the user to register their software
and so on. They may also serve as DRM vehicles, validating CDs and decrypting data to prevent villainous individuals (yarr) from brrreakin' ye olde DMCA.
At their heart, they aren't any more complex than in the Windows 95 days -- a glorified unzip program.
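That core job is small enough to sketch: extract an archive that ships next to the setup program into a target folder (paths here are hypothetical; a real installer would also create shortcuts, registry entries, and so on).

```csharp
// Sketch of the "glorified unzip" core of a simple installer.
using System;
using System.IO;
using System.IO.Compression;

class TinyInstaller
{
    static void Main()
    {
        string payload   = Path.Combine(AppContext.BaseDirectory, "payload.zip"); // shipped with the installer
        string targetDir = Path.Combine(
            Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData),
            "HypotheticalApp");

        Directory.CreateDirectory(targetDir);
        ZipFile.ExtractToDirectory(payload, targetDir);                            // the actual "install"
        Console.WriteLine($"Installed to {targetDir}");
    }
}
```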
Sidenote: Where does the installer get 5GB of data from 200MB of archives if not the Internet?
That's high, though there are plenty of ways you could get that compression ratio. Imagine a complex game whose world is defined in verbose XML -- that's readily compressible. You could even get that back in the old WinZIP days.
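You can see the effect with a toy experiment: compress a highly repetitive, XML-like blob and compare sizes (the exact ratio depends entirely on the data, so treat this only as an illustration).

```csharp
// Repetitive XML-like text compresses dramatically; the ratio is data-dependent.
using System;
using System.IO;
using System.IO.Compression;
using System.Text;

class CompressionDemo
{
    static void Main()
    {
        var sb = new StringBuilder();
        for (int i = 0; i < 100_000; i++)
            sb.Append($"<tile x=\"{i % 64}\" y=\"{i % 64}\" type=\"grass\" walkable=\"true\"/>\n");

        byte[] raw = Encoding.UTF8.GetBytes(sb.ToString());

        using var compressed = new MemoryStream();
        using (var gzip = new GZipStream(compressed, CompressionLevel.Optimal))
            gzip.Write(raw, 0, raw.Length);

        Console.WriteLine($"raw: {raw.Length:N0} bytes, gzip: {compressed.Length:N0} bytes");
    }
}
```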
A zip file just holds some files; you unzip it and get those files as-is.
An installer, however, can be a very complicated program. It can create the needed file and folder structures, register the required DLLs on your system, let you choose which features to install, check your system for compatibility, and act as a wizard guiding you, step by step, through a custom installation of your application.
An installer (especially Windows Installer) can make registry entries automatically, as well as unpack and write files to a directory. With a zip, you have to extract the files manually and get no automatic registry edits.
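For what it's worth, an "automatic registry entry" boils down to something like this (Windows only; the key and value names are made up):

```csharp
// A single registry write, roughly what an installer does many times over.
using Microsoft.Win32;

class RegistryStep
{
    static void Main()
    {
        Registry.SetValue(
            @"HKEY_CURRENT_USER\Software\HypotheticalApp",  // key (created if missing)
            "InstallDir",                                   // value name
            @"C:\Program Files\HypotheticalApp");           // value data
    }
}
```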
The advantage to a zip is that it guarantees (most of the time) that the application is portable, that all necessary files are included in the unzipped directory.
The advantage of an installer is pretty obvious: automation and a UI.
As for the 200 MB -> 5 GB question: packing the files into an exe can apply another layer of better compression than simply throwing the files into a zipped folder. That said, 200 MB -> 5 GB is a pretty big jump; not impossible, just big. Installers that do perform large external (online) downloads typically let you know beforehand that they are about to download a large chunk of data and that you should not disconnect from the internet during the install.
An installer or EXE can more easily be affected by a virus, whereas a ZIP archive is less likely to carry one, and a zip is more flexible too because it can be protected with your own password.
Another benefit is that ZIP compresses the files as well.
Hope that makes sense.
So, I've done a bit of reading around the forums about AssetBundles and the Resources folder in Unity 3D, and I can't figure out the optimal solution for the problem I'm facing. Here's the problem:
I've got a program designed for standalone that loads "books" full of .png and .jpg images. The pages are, at the moment, the same every time the program starts. At the start of the scene for any "book", it loads all those images at once using www.texture and a path. I'm realizing now, however, that this is possibly a non-performant way of accessing things at runtime -- it's slow! Which means the user can't do anything for 5-20 seconds while the scene starts and the book's page images load up (on non-legendary computers). So, I can't figure out which of these three approaches would be the fastest:
1) Loading one asset bundle per book (say 20 textures at 1 MB each).
2) Loading one asset bundle per page (1 MB each).
3) Either of the first two options, but loaded from the Resources folder.
Which one would be faster, and why? I understand that asset bundles are packaged by unity, but does this mean that the textures inside will be pre-compressed and easier on memory at load time? Does the resources folder cause less load time? What gives? As I understand it, the resources folder loads into a cache -- but is it the same cache that the standalone player uses normally? Or is this extra, unused space? I guess another issue is that I'm not sure what the difference is between loading things from memory and storing them in the cache.
Cheers, folks...
The Resources folders contain bundled managed assets. That means they will be compressed by Unity, following the settings you apply in the editor, so they are efficient to load at runtime. You can tailor the compression for each platform, which should further optimize performance.
We make extensive use of Resources.Load() to pull assets, and it performs well on both desktop and mobile.
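A typical call looks like this (the path is hypothetical and must sit under a Resources folder, with no file extension):

```csharp
// Loading a page texture that lives at Assets/.../Resources/Books/Book1/page01.png
using UnityEngine;

public class PageLoader : MonoBehaviour
{
    void Start()
    {
        Texture2D page = Resources.Load<Texture2D>("Books/Book1/page01"); // no extension
        if (page != null)
            GetComponent<Renderer>().material.mainTexture = page;
    }
}
```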
There is also a special folder, called StreamingAssets, where you can put bundled un-managed assets. This is where we put videos that we want to play at runtime but don't want Unity to convert to its default Ogg codec; on mobile these play in the native video player. You can also put images in there, but loading them is like using the WWW class: slow, because Unity needs to sanitize and compress the images at load time.
Loading via WWW is slower due to the overhead of processing the asset, as mentioned above, but it lets you pull data from a server or from outside the application "sandbox".
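Here is a rough sketch of loading an image out of StreamingAssets at runtime with UnityWebRequest (the successor to WWW); the file name is hypothetical, and on some platforms the path may need a "file://" prefix:

```csharp
using System.Collections;
using System.IO;
using UnityEngine;
using UnityEngine.Networking;

public class StreamingAssetLoader : MonoBehaviour
{
    IEnumerator Start()
    {
        string path = Path.Combine(Application.streamingAssetsPath, "page01.png");

        using (UnityWebRequest req = UnityWebRequestTexture.GetTexture(path))
        {
            yield return req.SendWebRequest();

            if (string.IsNullOrEmpty(req.error))           // request succeeded
            {
                Texture2D tex = DownloadHandlerTexture.GetContent(req);
                GetComponent<Renderer>().material.mainTexture = tex;
            }
        }
    }
}
```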
Only load what you need to display, and implement a background process to fetch additional content while the user is busy going through the first pages of each book (see the sketch after these tips). This avoids blocking the UI for too long.
Optimize the images to reduce the file size: use TinyPNG if you need transparent images, or stick to compressed JPGs.
Try using power-of-two image dimensions where possible; this should speed up the runtime processing a little.
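A rough sketch of the "first page now, the rest in the background" idea using Resources.LoadAsync (the folder layout and page naming are hypothetical):

```csharp
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class BookLoader : MonoBehaviour
{
    public int pageCount = 20;
    readonly List<Texture2D> pages = new List<Texture2D>();

    IEnumerator Start()
    {
        // Page 1 is needed right away, so load it synchronously.
        pages.Add(Resources.Load<Texture2D>("Books/Book1/page01"));
        ShowPage(0);

        // Remaining pages load asynchronously while the user reads.
        for (int i = 2; i <= pageCount; i++)
        {
            ResourceRequest req = Resources.LoadAsync<Texture2D>($"Books/Book1/page{i:D2}");
            yield return req;                          // does not block the UI
            pages.Add(req.asset as Texture2D);
        }
    }

    void ShowPage(int index)
    {
        GetComponent<Renderer>().material.mainTexture = pages[index];
    }
}
```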
Great answer from Jerome about Resources. To add some additional info for future searches regarding AssetBundles, here are two scenarios:
Your game is too big
You have a ton of textures, say, and your iOS game is above 100 MB -- meaning Apple will show a warning to users and prevent them from downloading over cellular. Resources won't help because everything in that folder is bundled with the app.
Solution: Move the artwork you don't absolutely need on first-run into asset bundles. Build the bundles, upload them to a server somewhere, then download them at runtime as needed. Now your game is much smaller and won't have any scary warnings.
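A minimal runtime download might look like this (the URL, bundle name, and asset name are all hypothetical):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

public class BundleDownloader : MonoBehaviour
{
    IEnumerator Start()
    {
        string url = "https://example.com/bundles/book1-pages";

        using (UnityWebRequest req = UnityWebRequestAssetBundle.GetAssetBundle(url))
        {
            yield return req.SendWebRequest();

            if (string.IsNullOrEmpty(req.error))
            {
                AssetBundle bundle = DownloadHandlerAssetBundle.GetContent(req);
                Texture2D page = bundle.LoadAsset<Texture2D>("page01");
                GetComponent<Renderer>().material.mainTexture = page;
                bundle.Unload(false);   // keep the loaded texture, release the bundle
            }
        }
    }
}
```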
You need different versions of artwork for different platforms
Alternative scenario: you're developing for iPhone and iPad. For the same reasons as above, you shrink your artwork as much as possible to stay under the 100 MB limit for iPhone. But now the game looks terrible on iPad. What do?
Solution: You create an asset bundle with two variants. One for phones with low res artwork, and one for tablets with high res artwork. In this case the asset bundles can be shipped with the game or sent to a server. At run-time you pick the correct variant and load from the asset bundle, getting the appropriate artwork without having to if/else everywhere.
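Picking the variant at runtime can be as simple as this sketch (the "artwork.hd"/"artwork.sd" names and the memory check are made up, and the bundles are assumed to ship in StreamingAssets on a platform where that is a plain folder; on Android you would download them or go through UnityWebRequest instead):

```csharp
using System.IO;
using UnityEngine;

public class ArtworkVariantLoader : MonoBehaviour
{
    void Start()
    {
        // Crude device check, just for illustration.
        bool useHighRes = SystemInfo.systemMemorySize > 2048;
        string bundleName = useHighRes ? "artwork.hd" : "artwork.sd";

        string path = Path.Combine(Application.streamingAssetsPath, bundleName);
        AssetBundle bundle = AssetBundle.LoadFromFile(path);

        Texture2D splash = bundle.LoadAsset<Texture2D>("splash");   // same asset name in both variants
        GetComponent<Renderer>().material.mainTexture = splash;
    }
}
```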
With all that being said, asset bundles are more complicated to use, poorly documented, and Unity's demos don't work properly at times. So seriously evaluate whether you need them.