How do I set Google login credentials as an environment variable (or "pass them to a method")? - ruby

The documentation for the roo library says that in order to use Google spreadsheets I need to
set the environment variables ‘GOOGLE_MAIL’ and ‘GOOGLE_PASSWORD’ or
you pass the Google-name and -password to the Google#new method.
I'm new to Ruby and naively tried to set the environment variables (on Windows) through the System Properties dialog, but it seems they aren't environment variables in that sense (and I guess that would be a bad way to store sensitive data anyway).
I've since deleted the environment variables from my (user, not system) settings, so I'm back to square one. How do I follow this instruction? I don't understand what it means by "pass the Google-name and Google-password to the Google#new method". I'm trying to run the line
oo = Roo::Google.new('"' + sheetfull + '"')
and I could ask for the details in the program rather than changing system settings (to make it easier for other people to use my code), with something along these lines:
puts "What's your email?"
GOOGLE_MAIL = gets
puts "What's your password?"
GOOGLE_PASSWORD = gets
so that these 'environment variables' are set up before the spreadsheet is called with oo, or else it results in an error.
I'm not quite sure how I'd tell it that they've been given, though... I tried the code above, but it obviously just initialises GOOGLE_MAIL and GOOGLE_PASSWORD as constants(?) that don't get "passed to any methods".
Sorry if I've worded this poorly, I'm still learning all the lingo! Feel free to call me out on any of the things I've named etc. etc.

You can set environment variables using the instructions here: https://superuser.com/questions/284342/what-are-path-and-other-environment-variables-and-how-can-i-set-or-use-them
Alternatively, you can pass the values you prepared to roo like so:
Roo::Google.new(sheetfull, user: GOOGLE_MAIL, password: GOOGLE_PASSWORD)
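If you would rather prompt for the credentials at runtime (as the question suggests), note that Ruby can also set environment variables for its own process through the ENV hash, so you can satisfy the documented GOOGLE_MAIL/GOOGLE_PASSWORD route without touching Windows system settings. A minimal sketch, assuming the roo behaviour quoted in the question and the question's sheetfull variable:
puts "What's your email?"
ENV['GOOGLE_MAIL'] = gets.chomp       # chomp strips the trailing newline that gets leaves
puts "What's your password?"
ENV['GOOGLE_PASSWORD'] = gets.chomp

oo = Roo::Google.new(sheetfull)       # roo should now find the credentials in the environment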

Related

Setting a global value (and keeping it) within the scope of an eval

I've got a large Rails 5 app (Ruby 2.6.x at present) that makes crucial use of Kernel::eval (please don't tell me to try to refactor this out because eval is dangerous - I didn't write the original code, and this is not in the cards for any time soon).
There are a very wide variety of Ruby expressions (coming out of the db) that can be passed to eval, sometimes of great complexity, making extensive use of classes and resources of the app.
(you might want to jump straight down to BIG EDIT below)
What I want is to be able to set a global value ($global) that will be seen within the scope of the eval execution, but that will not "infect" any of the execution context outside of that. I can't interpolate this into the string and pass it down through method params and such, because, as I say, the code being eval'ed is complex and stacks can get very deep, and I want the value to potentially be accessed (though never modified) anywhere within.
I understand about Bindings. I have played around with setting local and instance vars in a binding, and passing this to eval, but inevitably these are not seen inside any method calls within the eval, especially if I'm inside a method of some random class (which I always am). Seems like global is the only possibility. But experimentation shows that a global set inside an eval remains in the code that calls the eval:
2.6.3 :002 > $foo
=> nil
2.6.3 :003 > eval("$foo = 12")
=> 12
2.6.3 :004 > $foo
=> 12
Although I might find some hacky way to deal with this situation, I'm sure you can see why I'd really rather not.
The Binding class offers methods to set local and instance vars dynamically within a Binding object, but nothing for globals (apparently). I've thought about something like this:
...
eval code_string, get_binding()
...
def get_binding
  $global = :special_value
  binding
end
but I'm really worried, with a Rails app that might be servicing lots of requests at the same time, that these settings of $global will step on each other in unpredictable ways. Related clarifying question: Is a global value in a Rails app global to the entire thing, readable and writable within the scope of all the requests whose servicing may be overlapping in time? (I'm running under Passenger, if that means anything)
So this is a fairly simple and straightforward problem when you understand it, although oddly not addressed in anything I can google about it, and I think I've written enough words. Thanks for any help or ideas to try.
BIG EDIT:
Ok, let me try to refocus this in a different way. I'm getting that the scope of a global can never, no-how, be constrained (duh, right?), but how about this strategy (similar to above):
...
eval code_string, get_binding()
...
def get_binding
  luaapg = :special_value  # local used as a pseudo-global
  binding
end
So, now I've got this Binding that includes the local var luaapg. I've confirmed that. I eval code_string with this Binding. When I am somewhere inside the execution of code_string, where do I find luaapg - how do I access it? If you look at pretty much every tutorial on this stuff on the web, they show you puts eval("luaapg", get_binding) and voila, the assigned value comes out! But this is too simplistic for real life. When I am in the middle of my code_string, in some method scope of some class, luaapg is not there. I had great hope that this would work, even deep down the stack:
TOPLEVEL_BINDING.local_variable_get(:luaapg)
but it doesn't (I learned about TOPLEVEL_BINDING from here - thanks to that author). So this is the new question: what does it mean to say that I have executed (eval'ed) my code_string in the context of that Binding, which contains a local variable, if I have no way to access that variable, other than with the most simpleminded code? (incidentally I played around with instance vars too - same thing). I'm still hoping there's some magic incantation...
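For what it's worth, the behaviour described above is easy to reproduce: a local captured in a Binding is visible to top-level code eval'ed with that binding, but not inside a method defined in the eval'ed string, because def starts a fresh scope. A minimal sketch of the symptom (not a solution):
def get_binding
  luaapg = :special_value   # local used as a pseudo-global
  binding
end

code_string = <<~'RUBY'
  def deep_method
    defined?(luaapg) ? luaapg : "luaapg is not visible here"
  end
  puts luaapg        # the simple tutorial case: prints special_value
  puts deep_method   # prints "luaapg is not visible here"
RUBY

eval(code_string, get_binding)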
I think you've put your finger on it in the name of the type of variable - it's global - common to all the code in the executing program. I'm not sure exactly how Passenger works but I suspect it runs several copies of your program, so it won't be common between the copies.
To get reliable shared information I think you're going to have to use your database or some sort of information cache like memcached. You choose how you save/name it there.
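A hedged sketch of that suggestion using Rails' cache API, assuming Rails.cache is backed by a store shared across processes (such as memcached); the key naming scheme and request_id here are hypothetical, and you would still need some agreed way to derive the key wherever the value is read:
# somewhere before the eval, for a hypothetical request_id
Rails.cache.write("pseudo_global/#{request_id}", :special_value)

# ...later, anywhere inside the eval'ed code
value = Rails.cache.read("pseudo_global/#{request_id}")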

What benefit does discriminating between local and global variables provide?

I'm wondering what benefit discriminating between local and global variables provides. It seems to me that if everything were made a global variable, there would be a lot less confusion.
Wouldn't declaring everything a global variable result in fewer errors, because one wouldn't mistakenly reference a local variable in a global context?
Where is my logic wrong on this?
Some of this boils down to good coding practices. Keeping variables local also means it becomes simpler to share code from one application to another without having to worry about name conflicts. While it's simpler to make everything global, getting into the habit of only using global variables when you actually have to will force you to code more efficiently and will make your code more structured.
I think your key oversight is thinking that an error telling you a local variable doesn't exist is a bad thing - it isn't. You've made a mistake and ruby is telling you so. This type of mistake is usually easy to fix: you've misspelled something or you're using something that you forgot to create.
Global variables everywhere might remove those errors, but they would replace them with a far harder set of errors to reason about: accidentally using a variable that another bit of code is using. Imagine if every time you called a function (one of your own, a standard library one, or one from a gem) you had to check which global variables it might change (and which functions it calls, since those might also change global variables). If you make a mistake you might get an error message (if the class of the object in the variable changes enough), but often you would just silently get incorrect results (if the value of a variable you were using changes unexpectedly).
In general global variables are much harder to work with and people avoid them when possible.
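A contrived sketch of that failure mode, where two unrelated bits of code silently share a global:
$count = 0

def record_login
  $count += 1     # one author uses $count to count logins
end

def record_error
  $count += 1     # another author independently picked $count for errors
end

record_login
record_error
puts $count       # => 2; neither count is right, and nothing raised an error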
If all variables are global, every line of code in every program (including those which haven't been written yet), written by every programmer on the planet (including those who haven't been born yet or are already dead), must universally and uniquely agree on the names of variables. If you use a variable name that someone else on a different continent will also use two years from now, both of your programs will break when used together.

Strange env variables in Mac OS X

I sometimes use environment variables in my shell to pass values to shell or Ruby scripts. The scripts use the value if the variable is set.
A few times I've noticed that some conditional statements (based on the presence of such a variable) were executed even though I hadn't set a value for the variable $DESTINATION. My scripts exit with an error and print the value. It's always something like /var/folders/9X/9XWGo2YVHP0iDllOTi886E+++TI/-Tmp-/[a file name]
As far as I can diagnose this, [a file name] is always something downloaded by something like Sparkle (the library that helps downloading and installing software updates in applications).
It's not that bad, but it's a little annoying that it leaks into a completely different context than the one it's used in.
Can anybody confirm or deny my conclusions?

What does SHOW ALL show?

What is the technical name for the variables shown by the SHOW ALL command in SQL*Plus? I want to know because the term "variable" is quite overloaded and I'd like to be specific.
Typing "help show" says:
SHOW
----
Shows the value of a SQL*Plus system variable, or the current
SQL*Plus environment. SHOW SGA requires a DBA privileged login.
...
Are ALL of the variables listed considered "SQL*Plus system variables"? I've seen some sources which refer to them as "SQL*Plus environment variables", others which call them "parameters".
Are some of them "system variables" while others are "environment variables"/"parameters", or are these terms interchangeable, or what?
According to Oracle:
Sho[w]
...
Shows the value of a SQL*Plus system variable...
Looking at the list, they are, in fact, all variables that are available only within SQL*Plus. It's not terribly surprising that you see them referred to in various forms, as most people will use the term that makes the most sense to them, rather than looking up the canonical term.
If you specify SHOW PARAMETERS instead, you get a list of initialization parameters, which are something else altogether.

How to safely let users run arbitrary Ruby code?

I realize this sounds a little crazy, but I'm working on a project for which I need a server to run user-provided Ruby code and return the result.
I'm looking to prevent something like this:
system("rm -rf /")
eval("something_evil")
# etc...
I'm sure there must be some reasonably safe way to do this, as it already exists at places like tryruby.org. Any help is greatly appreciated, thanks!
Three suggestions:
1) Take a look at Ruby taint levels. This provides some degree of protection against eval('evil_code')-type things, etc.
2) Unless users actually need access to the local file system, use something like fakefs (see the sketch after this list).
3) No matter what else you do, follow Tronic's suggestion (it can be a pain to set up, but a limited chroot jail is about the only way to make absolutely sure that users cannot access resources you don't explicitly want them to).
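For suggestion 2, a minimal sketch of the fakefs idea, assuming the gem's block form (file operations inside the block go to an in-memory filesystem rather than the real disk):
require 'fakefs/safe'

FakeFS do
  File.write('/etc/passwd', 'nothing real was touched')
  puts File.read('/etc/passwd')   # read back from the fake filesystem
end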
Run the program ptraced with a whitelist of allowed syscalls, as user/group nobody, with resource limits (memory usage etc), in a minimal chroot.
A "blank slate" is an object stripped of (most of) its methods.
A "clean room" is an object within which you evaluate potentially unsafe room.
If you evaluate the code in a "clean room" which is also a "blank slate," cranking the safe level up as high as it will go, you will afford yourself a great deal of protection. Nothing in security is sure, so this should be considered a layer in your security, not necessarily the only layer.
This answer shows how to do it.
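A minimal sketch of the blank-slate/clean-room idea using BasicObject (Ruby's built-in near-blank slate); this is illustrative only and is not a real sandbox by itself, since evaluated code can still reach constants through :: and burn CPU or memory:
class CleanRoom < BasicObject   # almost no methods for the evaluated code to abuse
  def run(code)
    instance_eval(code)
  end
end

room = CleanRoom.new
puts room.run("1 + 1")            # => 2

begin
  room.run("system('rm -rf /')")
rescue NameError => e
  puts "blocked: #{e.class}"      # Kernel#system isn't available inside the clean room
end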
I had the same problem but then came across eval.so and decided to write an API wrapper for it, called Sandie. It's as easy as:
sandie = Sandie.new(language: 'ruby')
# => #<Sandie:0x00000002e30650>
sandie.evaluate(code: 'puts "hello world"')
# => {"stdout"=>"hello world\n", "stderr"=>"", "wallTime"=>487, "exitCode"=>0}
It also supports a whole lot of other languages, like C#, Perl, Lua, and Java.
