Is it possible to legally play Windows Media DRM tracks progressively as they are downloaded from an HTTP link?
I've managed to do this without DRM but someone here told me it wasn't possible with DRM. Is this true?
And if so, with what set of libraries or technologies?
I'm pretty sure it's possible, and it's not illegal either. Normally, information about the content encryption and the actual encrypted content are kept separate within the same container; that is to say, the metadata is not encrypted but the content is. To support progressive download, the content is normally encrypted in small, ordered, individually transferable chunks. Your DRM client uses the metadata to locate a license server, then acquires a license and content key from that server. Next, your client downloads the content chunk by chunk, decrypts each chunk, and renders it as a motion picture while the download is still in progress.
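As a rough, vendor-neutral sketch of that flow (the license server URL, track URL, chunk size, and the acquireLicense/decryptChunk calls are placeholders for whatever your DRM SDK actually provides):

    // Conceptual sketch only: license acquisition and per-chunk decryption
    // are placeholders standing in for a real DRM SDK and media pipeline.
    import java.io.InputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class ProgressiveDrmDownload {

        public static void main(String[] args) throws Exception {
            // 1. Use the (unencrypted) metadata to find the license server,
            //    then acquire a license / content key from it.
            byte[] contentKey = acquireLicense("https://license.example.com"); // hypothetical

            // 2. Download the content and decrypt it chunk by chunk,
            //    handing each decrypted chunk to the renderer as it arrives.
            URL url = new URL("http://media.example.com/track.wma");           // hypothetical
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            try (InputStream in = conn.getInputStream()) {
                byte[] chunk = new byte[64 * 1024];                            // arbitrary chunk size
                int read;
                while ((read = in.read(chunk)) != -1) {
                    byte[] clear = decryptChunk(chunk, read, contentKey);      // hypothetical SDK call
                    render(clear);                                             // hand off to the player
                }
            }
        }

        // Placeholders for the DRM SDK and the playback pipeline.
        static byte[] acquireLicense(String licenseServerUrl) { return new byte[16]; }
        static byte[] decryptChunk(byte[] buf, int len, byte[] key) { return buf; }
        static void render(byte[] clearData) { /* feed the decoder */ }
    }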
For information on SDKs, check out the different components here.
Yes it is.
I have a Java/J2EE application deployed in a Tomcat container on a Windows server. The application is a training portal where the training files, such as PDF/PPT/Flash/MP4 files, are read from a share path. When the user clicks a training link, the associated file is downloaded from the share path to the client machine and starts running.
When the user clicks MP4/Flash/PDF files, they take too much time to open.
Is there anything we need to configure at the application level? Is it a load-related configuration on the server? Or is it something that needs attention in the WAN settings?
I am unable to find a solution for this issue.
Please post your thoughts.
I'm not 100% sure because there are not many details, but I'm 90% sure that the application code is not the main problem.
For example:
When the user clicks MP4/Flash/PDF files, they take too much time to open.
A PDF is basically just a static file. Flash is a client-side technology. And I'm pretty sure that you just send a stream to a video player in order to play an MP4. So the server is supposed to just take the file and send it. The server is probably not the source of your problem, if we assume that it can handle the number of requests.
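For reference, "just take the file and send it" looks roughly like this in a servlet (the share path and buffer size are made up, and there is no path validation here; this is only a sketch):

    // Minimal file-streaming servlet: read from the share path, copy to the response.
    import java.io.*;
    import javax.servlet.http.*;

    public class TrainingFileServlet extends HttpServlet {
        private static final String SHARE = "\\\\fileserver\\training\\"; // hypothetical share path

        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
            File file = new File(SHARE, req.getParameter("name")); // no path validation: sketch only
            resp.setContentType(getServletContext().getMimeType(file.getName()));
            resp.setContentLengthLong(file.length());
            try (InputStream in = new FileInputStream(file);
                 OutputStream out = resp.getOutputStream()) {
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n); // the servlet just copies bytes; the network sets the pace
                }
            }
        }
    }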
So it's about your network: it is too slow for some reason. And it's difficult to be more specific without more details.
Regards.
I'm developing a project using CakePHP 3 for the server side and Android for the client side. In this project I have to send a lot of images of products from the server to the app. When the app requests the images of a product, the server searches the database for the URLs of the images and sends them to the app in a JSON response. Then, in the app, I load the images using NetworkImageView from the Volley library.
This process works, but the images are heavy in size, so it consumes a lot of data on a mobile network like 4G. I can't afford to lose image quality, so I can't process the images too much.
What I was thinking was to compress (somehow) the images on the server side, send the array of bytes through JSON and decompress them in the app, so I could minimize data consumption.
I couldn't find any info on what I described above and I'm not sure if this is the right approach. Any help would be appreciated.
What I was thinking was to compress (somehow) the images on the server side, send the array of bytes through JSON and decompress them in the app, so I could minimize data consumption.
JSON will increase, not lower, the amount of data that needs to be sent, for obvious reasons. It's an envelope in your use case, and the way JSON works it will add more data. Check the JSON spec.
You want to enable gzip compression on your web server (Nginx here); check Google or superuser.com for the details.
But this won't make a dramatic difference for mobile use either when you send a 20-megapixel image. I would send small images and only send a larger version when needed, when the user zooms in. Guess that's doable.
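Your server side is CakePHP, so the following is only to illustrate the idea in Java: generate a small variant once on the server, hand its URL to the app, and keep the original for zooming. The file names and target width are made up.

    // Rough sketch: produce a smaller JPEG variant so the app can fetch the
    // small one in lists and request the original only when the user zooms.
    import java.awt.Graphics2D;
    import java.awt.Image;
    import java.awt.image.BufferedImage;
    import java.io.File;
    import javax.imageio.ImageIO;

    public class ThumbnailMaker {
        public static void main(String[] args) throws Exception {
            BufferedImage src = ImageIO.read(new File("product_original.jpg")); // hypothetical file
            int targetWidth = 640;                                              // arbitrary size
            int targetHeight = src.getHeight() * targetWidth / src.getWidth();

            Image scaled = src.getScaledInstance(targetWidth, targetHeight, Image.SCALE_SMOOTH);
            BufferedImage thumb = new BufferedImage(targetWidth, targetHeight, BufferedImage.TYPE_INT_RGB);
            Graphics2D g = thumb.createGraphics();
            g.drawImage(scaled, 0, 0, null);
            g.dispose();

            ImageIO.write(thumb, "jpg", new File("product_small.jpg"));         // serve this URL to the app
        }
    }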
I have a large file on Windows Azure and I want to download and save it on my disk. The maximum time for each link on Windows Azure is 60 minutes. If I download directly based on the link, there may not be enough time. How can I download it?
Nathan, your question isn't very clear, but I suspect you are referring to the time allowed for a Shared Access Signature, and are concerned that the client might not download the file within the time allowed?
There are 2 scenarios here:
1. Once a storage transaction (i.e. downloading a file) which uses a SAS begins, the transfer will be able to continue past the expiration of the SAS. It is only new requests authenticated using the SAS that will fail if they are attempted past the expiration time on the SAS.
2. If the client has to resume the download (or is downloading in blocks), then the client has to be smart enough to detect the failed authentication after the SAS expires and then re-request a new SAS from the issuer.
Try using a download accelerator like FlashGot or something similar.
One option would be to download the file in pieces and reassemble it once you have the pieces. There are a couple of ways to do that.
If the blob was uploaded in multiple blocks, then you could download each block individually. This is supported directly in the client libraries, so if you can do this it's probably easier. You can also download the blocks in parallel to reduce the total time it takes to download.
You could use HTTP Range headers to get certain byte ranges. I don't believe this is supported in the clients, so you'd probably have to code it yourself. But it will work even if the blob was not uploaded in blocks. I think this could also be done in parallel, but I'm not sure.
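As a minimal sketch of the Range-header approach using nothing but HttpURLConnection (the SAS URL and chunk size are placeholders; parallelism and retries are omitted):

    // Download a blob in fixed-size byte ranges over plain HTTP using Range headers.
    import java.io.*;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class RangeDownload {
        public static void main(String[] args) throws Exception {
            String sasUrl = "https://myaccount.blob.core.windows.net/files/big.bin?sv=...&sig=..."; // placeholder
            long chunkSize = 4L * 1024 * 1024; // 4 MB per request (arbitrary)

            // Ask for the total size first.
            HttpURLConnection head = (HttpURLConnection) new URL(sasUrl).openConnection();
            head.setRequestMethod("HEAD");
            long total = head.getContentLengthLong();
            head.disconnect();

            try (OutputStream out = new FileOutputStream("big.bin")) {
                for (long start = 0; start < total; start += chunkSize) {
                    long end = Math.min(start + chunkSize, total) - 1;
                    HttpURLConnection conn = (HttpURLConnection) new URL(sasUrl).openConnection();
                    conn.setRequestProperty("Range", "bytes=" + start + "-" + end); // expect 206 Partial Content
                    try (InputStream in = conn.getInputStream()) {
                        in.transferTo(out); // Java 9+
                    }
                    conn.disconnect();
                }
            }
        }
    }

Each range request that starts before the SAS expires should succeed; if one fails with an authentication error, you'd ask the issuer for a fresh SAS and continue from the last completed range.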
I would like to achieve the following:
Set up a proxy server to handle video requests by clients (for now, say all video requests from any Android video client) from a remote video server like YouTube, Vimeo, etc. I don't have access to the video files being requested, hence the need for a proxy server. I have settled on Squid. This proxy should process the video signal/stream being passed from the remote server before relaying it back to the requesting client.
To achieve the above, I would either
1. Need to figure out the precise location (URL) of the video resource being requested, download it really fast, and modify it as I want before HTTP streaming it back to the client as the transcoding continues (simultaneously, with some latency)
2. Access the raw byte stream, pipe it into a transcoder (I'm thinking ffmpeg) and proceed with the streaming to client (also with some expected latency).
Option #2 seems tricky to do but lends more flexibility to the kind of transcoding I would like to perform. I would have to actually handle raw data/packets, but I don't know if ffmpeg takes such input.
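If ffmpeg does accept piped input, I imagine option #2 would look roughly like this on the proxy (untested sketch; the codec flags and the MPEG-TS output container are just one possible choice):

    // Rough sketch of option #2: pipe the incoming byte stream into ffmpeg's stdin
    // and relay its stdout back to the client while transcoding is in progress.
    import java.io.InputStream;
    import java.io.OutputStream;

    public class TranscodeProxy {

        // sourceStream: the raw bytes fetched from the remote server
        // clientStream: the response stream back to the requesting client
        public static void transcode(InputStream sourceStream, OutputStream clientStream) throws Exception {
            Process ffmpeg = new ProcessBuilder(
                    "ffmpeg", "-i", "pipe:0",          // read input from stdin
                    "-c:v", "libx264", "-c:a", "aac",  // example codecs
                    "-f", "mpegts", "pipe:1")          // write a streamable container to stdout
                    .redirectError(ProcessBuilder.Redirect.INHERIT) // keep ffmpeg's log out of the data path
                    .start();

            // Feed the source into ffmpeg on a separate thread...
            Thread feeder = new Thread(() -> {
                try (OutputStream toFfmpeg = ffmpeg.getOutputStream()) {
                    sourceStream.transferTo(toFfmpeg);
                } catch (Exception ignored) { }
            });
            feeder.start();

            // ...while relaying the transcoded output to the client on this thread.
            try (InputStream fromFfmpeg = ffmpeg.getInputStream()) {
                fromFfmpeg.transferTo(clientStream);
            }
            feeder.join();
            ffmpeg.waitFor();
        }
    }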
In short, I'm looking for a solution to implement real-time transcoding of videos that I do not have direct access to from my proxy. Any suggestions on the tools or approaches I could use? I have also read about GStreamer (but could not tell if it's applicable to my situation), and MPlayer/MEncoder.
And finally, a rather specific question: Are there any tools out there that, given a YouTube video URL, can download the byte stream for further processing? That is, something similar to the Chrome YouTube downloader but one that can be integrated with a server-side script?
Thanks for any pointers/suggestions!
You should ask single coding questions. What you asked is more like a general "how would I write my application". A few comments though:
Squid is an HTTP proxy; video is usually streamed over e.g. RTSP.
Yes, there are tools that grab the RTSP URL from a YouTube URL, but be sure to understand the terms of use for the video service before going that way.
GStreamer has a gst-rtsp-server module that contains an RTSP server, which can also be used as a proxy for a given RTSP stream.
I'm researching solutions for a potential client. They're requesting the ability to download a large number of MP3s (1000+) from their online catalog.
I've researched/tested building a zip containing all MP3s using ZipArchive but ran into obvious memory leak issues that have ruled that solution out.
I'm now trying to think out of the box.
One idea was to create an FTP queue or a Torrent type download link for them. Is there anything out there that can pull something like this off?
Any help or suggested direction would be greatly appreciated! Thanks!!
Edit: Here is the overall process/goal that we're trying to achieve.
The client creates music for TV/film placement. They maintain an online catalog AND a local copy they send to potential buyers. The online catalog and the offline catalog need to mirror each other. The problem is that they have multiple offices that will have to update their local copy with the new files added to the online catalog from many different locations.
Example: East Coast User updates catalog with 100 new files. West Coast User needs to update the offline catalog with the new files retrieved from the online catalog.
We had hoped to create custom zips of the files each user needed to update their catalog, based on the user's download history that we'd maintain in MySQL. We were testing ZipArchive but we couldn't seem to build zips over 175 MB (give or take). We're in the process of testing ZipStreaming but are having some issues.
I hope this clears up the overall goal and problems we are facing.
GNU wget?
It can download recursively. Just give wget a list of all the files on the server, e.g.
http://www.example.org/filelist.html, which contains links like file1.mp3, file2.mp3, etc. (Apache normally generates such an index file automatically when a directory without an index.html/php in it gets called.)
http://linux.die.net/man/1/wget
Frankly speaking, I can't identify the actual problem/question from your post. If you are looking to minimize network load, then remember that MP3 files don't compress well because they are already compressed (not as well as possible, but well). If you are looking for a transport, then any file transfer protocol will do (FTP, SFTP, HTTP, WebDAV).
If you need flexibility and features, I'd recommend SFTP: this is a protocol for remote file system access, so besides the "get file" operation it has plenty of useful operations, including machine-readable directory listings (not always available in FTP and not available in standard HTTP), built-in zlib compression, the built-in ability to resume file transfers, and more bonuses. HTTP also has zlib compression, but it is not always available.
Update: your approach doesn't take into account what is really available on the client; you are going to prepare ZIP files based on your (possibly incorrect) knowledge of what the client already has.
If the client and server are both applications that you develop, then you should use the rsync protocol or something similar to update the data online (not using any ZIP files) and download only the files that are missing on the client. If direct communication between the client and the server is not possible, you can make the client send its state to the server, and the server will prepare an individual package after that.

As for ZIP functionality, it's needed only when you use batch updates (no real-time communication between the client and the server). I don't know what technology you are using, but if your only problem is with the ZIP component, you can use something else for data packing: either a different ZIP component (for .NET and VCL we have a ZIP component) or some other packing solution (for example, our SolFS product doesn't have size limits). Unfortunately, I am not aware of an rsync-like implementation available as a component.
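As an illustration of the "download only what's missing" idea over SFTP, here is a rough sketch using the JSch library (the library choice, host, credentials, and paths are my assumptions; host-key verification and error handling are omitted):

    // Compare the local catalog against the remote SFTP listing and fetch missing MP3s.
    import com.jcraft.jsch.*;
    import java.io.File;
    import java.util.Vector;

    public class CatalogSync {
        public static void main(String[] args) throws Exception {
            File localDir = new File("C:/catalog");                 // hypothetical local copy
            String remoteDir = "/catalog";                          // hypothetical remote copy

            JSch jsch = new JSch();
            Session session = jsch.getSession("user", "sftp.example.com", 22);
            session.setPassword("secret");
            session.setConfig("StrictHostKeyChecking", "no");       // sketch only; verify keys in real code
            session.connect();

            ChannelSftp sftp = (ChannelSftp) session.openChannel("sftp");
            sftp.connect();

            @SuppressWarnings("unchecked")
            Vector<ChannelSftp.LsEntry> entries = sftp.ls(remoteDir);
            for (ChannelSftp.LsEntry entry : entries) {
                String name = entry.getFilename();
                if (name.endsWith(".mp3") && !new File(localDir, name).exists()) {
                    sftp.get(remoteDir + "/" + name, new File(localDir, name).getAbsolutePath());
                }
            }

            sftp.disconnect();
            session.disconnect();
        }
    }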