How to switch youtube-dl cookies files?

I'm trying to download a video from Instagram with a simple command:
youtube-dl https://instagram.com/p/<ID>
It doesn't work properly for me because I'm using my Netherlands VPN (Instagram wants me to log in).
I logged in to my Instagram account and generated a cookies file to use with youtube-dl, and it works fine! But if I want to switch to another Instagram account and use that account's cookies file, it returns an error.
I found out that youtube-dl rewrites my cookies file after the first request. The difference between the two cookies files is in the session_id cookie.
So, does anybody know how I could switch accounts?
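A sketch of how this is usually handled, assuming you export one cookies file per account while logged in to that account (youtube-dl's --cookies option both reads and writes the file you pass it, so keeping a separate file per account avoids the overwriting problem; the file names here are just examples):

# one exported cookies file per Instagram account
youtube-dl --cookies account1_cookies.txt https://instagram.com/p/<ID>
youtube-dl --cookies account2_cookies.txt https://instagram.com/p/<ID>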

Related

How to Create OAuth Client ID?

I am trying to run this GitHub project.
For this I need to create credentials and an OAuth Client ID.
When I click Create, after selecting "Web application" and typing the name, I see fields for authorised JavaScript origins and authorised redirect URIs.
What should I enter there? I tried keeping both fields empty, but I got an error saying no redirect URL was found for the client ID.
Please help me.
I am using the Developer Console for the first time and can't find any help on the internet. This site is my last hope.
I did as suggested in the comment.
Here is what I got in the Linux shell, and the error I received on the webpage the redirect URL opened.
Linux Shell Message
/YouTube-Subscription-Importer/env/lib/python3.6/site-packages/oauth2client/_helpers.py:255: UserWarning: Cannot access subscribe.py-oauth2.json: No such file or directory
warnings.warn(_MISSING_FILE_MESSAGE.format(filename))
Your browser has been opened to visit:
https://accounts.google.com/o/oauth2/auth?client_id=1069660256195-n8adm0dmi70v29i55hcfblftle09hb5n.apps.googleusercontent.com&redirect_uri=http%3A%2F%2Flocalhost%3A8080%2F&scope=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fyoutube&access_type=offline&response_type=code
If your browser is on a different machine then exit and re-run this
application with the command-line parameter
--noauth_local_webserver
Snippet of what the redirect URL webpage said
**MOST IMPORTANT: I am running this in Windows bash. Now tell me what to do next. I simply need a tutorial/how-to guide for Google Developer Console credentials and the OAuth consent screen.**
You need the redirect URL: after the Google OAuth consent screen you will be redirected to https://REDIRECT_URL?code=AuthorizationCode. You can just set it to http://localhost.
After creation, you can find your Client ID on the Credentials page.
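For example, to match the redirect_uri shown in the log above (assuming your script's local web server listens on port 8080, as in that URL), the console field would look something like:

Authorised redirect URIs: http://localhost:8080/

The authorised JavaScript origins field can usually stay empty for this kind of local-script flow; it only matters for browser-based JavaScript clients.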

Is it possible to use RDP with same browser data as admin?

I just tried RDP on Windows 10 and it worked very well.
I have an admin account (named A), and I created another user (named B). When I use account B to log in over RDP and open Google Chrome, I cannot access my Chrome data (history, cookies, passwords, ...); it's like a freshly installed Chrome. Is there any way that admin A and user B can use the same Chrome data? Thank you!
Chrome (and probably every browser) stores its data for different Windows users in different locations (C:\Users\<current-user-name>\AppData\Local\Google\Chrome\User Data\Default). So if you log in as a different user, Chrome starts with that user's data.
If you really want this, the easiest way is to sign in to Chrome with the same Google account under both Windows accounts and turn on the sync option.
Another idea is to copy the other user's browser data (C:\Users\<other-user>\AppData\Local\Google\Chrome\User Data\Default) into your own browser data location (C:\Users\<your-user>\AppData\Local\Google\Chrome\User Data\Default). This is really not recommended, because you don't know how Chrome handles that data internally, and I'm not sure it would even work; I guess it won't.
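If you do want to try the copy, a minimal sketch from a Windows command prompt (A and B are the account names from the question; close Chrome in both sessions first):

:: /E copies all subdirectories, including empty ones
robocopy "C:\Users\A\AppData\Local\Google\Chrome\User Data\Default" "C:\Users\B\AppData\Local\Google\Chrome\User Data\Default" /E

Note that Chrome encrypts saved passwords with Windows DPAPI, which is tied to the Windows account, so copied passwords will not decrypt under the other account even if everything else carries over.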

Files stored in AWS bucket to only be downloaded if users are logged into Laravel app. Not by directly going to URL

I only want users to be able to download files which are saved in my AWS bucket if they are logged into my Laravel app. I do not want users to be able to download the files if they are not logged into the system, and not by going directly to the URL. Can someone please help me achieve this? I think I may need to somehow set the documents to private, but I'm not sure about this.
There are two options that I know of:
Returning a temporary signed URL which the user can use to download the file directly from S3.
Streaming or downloading the file from S3 through your application.
For the first option you can use the Storage facade's temporaryUrl method like this to allow access for 5 minutes:
$url = Storage::disk('s3')->temporaryUrl(
    'file.jpg', now()->addMinutes(5)
);
To download the file through your server you could return:
return Storage::disk('s3')->download('file.jpg');
Don't quote me on the following, but I think the advantage of using the S3 temporary URL is that the download doesn't have to go through your server, which frees it up for other requests. The benefit of downloading through your server is that you can authenticate every request, and there is no temporary URL for a user to share.
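To tie either option to the login requirement, you would put the route behind the auth middleware. A minimal sketch (the route path and file-name handling are assumptions, not from the question):

// routes/web.php
use Illuminate\Support\Facades\Route;
use Illuminate\Support\Facades\Storage;

Route::get('/download/{file}', function (string $file) {
    // basename() guards against path traversal in the file parameter
    return Storage::disk('s3')->download(basename($file));
})->middleware('auth'); // only authenticated users get this far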

How to read files from Dropbox on remote server using Ruby?

I want to run automated scripts to read files from a Dropbox folder on our server. I started looking into the Dropbox gems that are out there, and they all seem to require the user to authenticate a session by opening a browser. This obviously doesn't make sense for an automated task. Is there a way to do this without requiring a user to actually open the browser manually?
The reason that they all require a web browser is that Dropbox uses OAuth v1. There is a way around this that may not be 100% in spirit with the Dropbox API T&C.
I would start by creating a Dropbox account that will be the user account you use from the scripts. Manually log in as this user, go to the authorization URL for your app, and approve it.
In your scripts you'll create an HTTP connection that uses that user ID and password to log in. You'll need to keep the information in the response that describes the user's session. Use that session information to make a second HTTP request to the authorization URL. Since the app is already authorized, you'll just need to capture the session token from the redirect URL.
The definite downside to this is that the password for the user is in your script. :P

sfGuardPlugin session: how to reuse it with wget -- or map SID to sfGuardUser

Recently I was asked to add an XML API to one of the Symfony modules in my project. I did it, and it works well. For authentication, we use sfGuardPlugin. Symfony version is 1.3.11. Using Propel, not Doctrine.
The most recent request to me is this:
We will embed a Flash game into the website.
The Flash will do requests to the XML API.
The guy who is coding the Flash application says that it doesn't share cookies with the browser.
We want the Flash to be able to reuse the session of the currently logged-in user (we won't even show the game if no user is logged in).
I tried this would-be solution (taken from other SO answers and various Google search results):
I was told that the Symfony session resides in the symfony cookie.
I was told that if I copy this value to another client (in my case, wget) and do session_id("stolen_session_id"), I will be able to duplicate the session, have the same user logged in, etc.
This turned out to be wrong. Say my symfony cookie had the value "blabla". I did this: wget --post-data='session_id=blabla' X.X.X.X:NN/api/bla.xml -O-. My server-side PHP code parses this POST parameter and feeds it to the session_id function. The logs then reported that session_id('blabla') returned 1. However, calling $this->getUser()->getGuardUser() returns null.
I need a way to map a passed session_id to a valid sfGuardUser. Or find an alternative way of reusing a session which already exists.
Suppose I have full access to the cookies. I want to know which one of them (or all of them?) to duplicate in order to achieve this.
BTW, I can see in my Chrome dev tools that the symfony cookie is of session type, so it's no wonder my method doesn't work; but I am a little lost as to how to do this in Symfony while using sfGuardPlugin.
I realize this is not one of the most informed questions, but still, I just need help.
Thanks for your time.
(Dimitar P.)
Oops, forgot to mention which cookies I see on my domain:
symfony
sfRemember
__utma
__utmb
__utmc
__utmz
I am guessing the last four are for Google Analytics, though.
I didn't want to do this, but I was unable to find any alternative:
$ wget --header='Cookie: symfony=blabla' X.X.X.X:NN/api/bla.xml -O-
I wanted my XML API to be REST, but evidently Symfony doesn't allow authenticated requests any other way than with cookies (and enabling the session ID to always be included in the URL is not an option at all).
Still, if somebody shows up with a fully REST alternative, I will upvote his/her answer.
You will need some way of specifying which user is executing the (wget) request, and by default PHP sessions identify that user through a session ID stored in a cookie.
A common solution is token-based authentication. The most common way to achieve this is OAuth, which has plenty of ready-made libraries (both for Symfony and for your API consumers).
If you're the only one using this API, you can also create a custom token (a random SHA-1 string) per user per session and store it somewhere in your database. Then you would call something like wget X.X.X.X:NN/api/bla.xml?token=asdfhdsfhf
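If you go the custom-token route, here is a rough sketch of the token-to-user mapping in a Symfony 1.3/Propel action. The api_token column on sf_guard_user and the module/action names are hypothetical additions; Criteria and doSelectOne are standard Propel 1.x, and signIn() comes from sfGuardPlugin's sfGuardSecurityUser:

// apps/frontend/modules/api/actions/actions.class.php (hypothetical module)
public function executeBla(sfWebRequest $request)
{
    // api_token is a hypothetical column you would add to sf_guard_user
    $c = new Criteria();
    $c->add(sfGuardUserPeer::API_TOKEN, $request->getParameter('token'));
    $user = sfGuardUserPeer::doSelectOne($c);

    // unknown or missing token -> 404
    $this->forward404Unless($user);

    // sign the matched user in for this request
    $this->getUser()->signIn($user);

    // ...from here on, $this->getUser()->getGuardUser() returns this user...
}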
