How to use the "Project Drawer" in TextMate 2 when it doesn't seem to exist? - textmate

In TextMate 2, when I open two files from different locations, such as /path/1/file.txt and /path/2/file.txt, I no longer see a way to perform diffs as before, because I cannot select both files in the project "drawer." A file browser seems to have taken the drawer's place, and it gives me no way to pick the two opposing files. This also precludes any other command that requires a multi-file selection of files that don't live within the same file structure.
Am I missing something that would allow this to work properly when dealing with files in two different paths?

This isn't a new trick. It's one we learned back when grep-in-project would go insane if you had a project whose files' common ancestor was the root or some directory far above the files. Instead of opening your files like:
mate /foo/bar/baz /quix/quacks/quux
you do the following, assuming you're in an empty directory or don't care that its files will be included in the project as well:
ln /foo/bar/baz /quix/quacks/quux . && mate .
That can obviously be wrapped up in a function to reduce the syntactic difference. In fact, at one point I wrote a wrapper script around mate that did this transparently when needed AND cleaned up the hard-linked files after I closed the project or quit TextMate. That script went away with a bad hard drive, though.
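For illustration, a minimal sketch of such a wrapper in Python might look like this (the script name, the temporary-directory naming, and the lack of a cleanup step are my own simplifications):

import os
import subprocess
import sys
import tempfile

def mate_together(paths):
    # Scratch directory that becomes the TextMate project root.
    project_dir = tempfile.mkdtemp(prefix="mate-project-")
    for p in paths:
        # Hard links keep edits in sync: both names point at the same file,
        # but this only works when everything is on one filesystem.
        os.link(os.path.abspath(p), os.path.join(project_dir, os.path.basename(p)))
    # Open the scratch directory as the project.
    subprocess.call(["mate", project_dir])

if __name__ == "__main__":
    mate_together(sys.argv[1:])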
Anyhow, HTH.


Moving files to corresponding subdirectory in Python

Every week at work, I am responsible for manually moving 100-200 files from one folder into a corresponding subfolder. After doing this for a couple of weeks, I thought to myself: This can be done faster!
I have used Python 2.7 and 3.X a bit at school, but mostly with (very) basic search engines and text search.
I found another thread where someone was told to use either os.rename or shutil.move. I made a simple test with os.rename:
os.rename("path/to/current/file.foo", "path/to/new/desination/for/file.foo")
And it works, so far so good.
Is there any way to make Python run through every file in a folder and move it into a corresponding subdirectory in another folder? The original directory contains all the files, while the target directory contains all the folders.
Every file (A_file, B_file, etc.) has the same name as its folder (A_folder, B_folder, etc.), which means they are in the correct order.
This makes me think a simple iteration could work, as in (more of an algorithm than code):
for file in original_dir
move file to folder_x in tar_dir
x += 1
Obviously this is not complete, but maybe someone can point me in the right direction.
This makes directories recursively:
os.makedirs(path)
So you pass it the path to the directory you want, e.g. /path/to/.
You would then follow that up with the move:
import os

def move_file(new_path_to_file, file_to_move):
    file_name = os.path.basename(file_to_move)  # last path component
    if not os.path.isdir(new_path_to_file):     # don't fail if the directory already exists
        os.makedirs(new_path_to_file)
    os.rename(file_to_move, os.path.join(new_path_to_file, file_name))
You could also make it easier by passing in the filename as well.
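Putting it together for the loop sketched in the question, a minimal version might look like the following. The directory paths and the A_file-to-A_folder naming rule are assumptions based on the description, so adjust them to your real names:

import os
import shutil

source_dir = "path/to/original_dir"  # assumed: folder holding all the loose files
target_dir = "path/to/target_dir"    # assumed: folder holding all the destination folders

for name in os.listdir(source_dir):
    src = os.path.join(source_dir, name)
    if not os.path.isfile(src):
        continue
    # Assumption: "A_file" belongs in "A_folder", "B_file" in "B_folder", and so on.
    dest_dir = os.path.join(target_dir, name.replace("_file", "_folder"))
    if not os.path.isdir(dest_dir):
        os.makedirs(dest_dir)
    shutil.move(src, os.path.join(dest_dir, name))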

Expressions in a build rule "Output Files"?

Can you include expressions in the "Output Files" section of a build rule in Xcode? Eg:
$(DERIVED_FILE_DIR)$(echo "/dynamic/dir")/$(INPUT_FILE_BASE).m
Specifically, when translating Java files with j2objc, the resulting files are saved in subfolders, based on the java packages (eg. $(DERIVED_FILE_DIR)/com/google/Class.[hm]). This is without using --no-package-directories, which I can't use because of duplicate file names in different packages.
The issue is in Output Files, because Xcode doesn't know how to search for the output file at the correct location. The default location is $(DERIVED_FILE_DIR)/$(INPUT_FILE_BASE).m, but I need to perform a string substitution to insert the correct path. However, any expression added as $(expression) gets ignored, as if it were never there.
I also tried exporting a variable from the custom script and using it in Output Files, but that doesn't work either, because the Output Files are transformed into SCRIPT_OUTPUT_FILE_X variables before the custom script is run.
Unfortunately, Xcode's build support is pretty primitive (compared to, say, make, which is thirty-odd years older :-). One option to try is splitting the Java source so that the two classes with the same name are in different sub-projects. If you then use different prefixes for each sub-project, the names will be disambiguated.
A more fragile, but maybe simpler, approach is to define a separate rule for one of the two classes, so that it can have a unique prefix assigned. Then add an early build phase to translate it before any other Java classes, so the rules don't overlap.
For me, the second alternative does work (Xcode 7.3.x) - to a point.
My rule is not for Java but for Google Protobuf, and I tried to maintain the same kind of hierarchy in the generated code (like your Java package hierarchy) as in the source .proto files. Indeed, the files (.pb.cc and .pb.h) were created as expected, with their hierarchies, inside the Build/Intermediates/myProject.build/Debug/DerivedSources directory.
However, Xcode usually knows to go on and compile the generated output into the current target, but here that breaks, as it only looks for files in the actual ${DERIVED_FILE} location, not within sub-directories underneath.
Could you please explain "Output Files are transformed into SCRIPT_OUTPUT_FILE_X" in more detail? I do not understand.

Command Prompt: Move a file to a folder with an unknown name

So, is there a way to move Test.txt to C:\ProgramData\CsD2\Tools\("Unknown Folder Name")\data\per using the command prompt?
Using foxidrive's solution from your previous question to detect the correct directory, then just:
move test.txt "%folder%\"
Short answer: yes. I'm not quite sure what situation has left only the middle part of your path unknown, or why you need to use the command line, but I have encountered similar cases on Linux and expect the algorithm can be adapted to Windows commands. It's possible to do this by hand rather than writing a shell script, but that's up to you and your skills.
Permissions matter. Make sure you elevate yours enough to read and write in Tools before continuing.
First, change directory to C:\ProgramData\CsD2\Tools\
Presumably there are many items here. Some may be "hidden," so list the contents of this directory and be sure to include an option to show hidden files and folders. If you can, restrict the search to directories only.
It's tempting to display contents recursively in the above step. It's up to you, but I find it makes the output cluttered without a script to do the rest of the work.
Now it's time to search for the subfolder set that theoretically only exists in your target folder. Suppose Tools contains the directories fldr1, fldr2, and fldr3. Use your command to list a directory's contents with the path "fldr1\data\per", then use "fldr2\data\per", and so on until it doesn't return an error. Per may be empty, but that should look different from the path not found error.
Now you've found the name of your mystery folder. Write it down for future reference.
At this point, you know the path to Test.txt and the full path to the destination directory. Do a move command to relocate Test.txt, and you're done. I like to list the contents of the target directory again afterwards, to be sure the file arrived.
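For what it's worth, the same probe-each-candidate-folder-and-move idea can also be scripted. Here is a rough Python sketch; only the Tools path comes from the question, and the location of Test.txt is a placeholder:

import os
import shutil

tools_dir = r"C:\ProgramData\CsD2\Tools"
source_file = r"C:\path\to\Test.txt"  # placeholder: wherever Test.txt currently lives

for entry in os.listdir(tools_dir):
    candidate = os.path.join(tools_dir, entry, "data", "per")
    if os.path.isdir(candidate):
        # Found the mystery folder: it is the one containing data\per.
        shutil.move(source_file, os.path.join(candidate, "Test.txt"))
        print("Moved to " + candidate)
        break
else:
    print("No folder under Tools contains data\\per")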

Testing File/Folder Navigation and Manipulation

I am working on a module that supplies methods for navigating directories and manipulating files. Basically it will be a combination of the Dir and File classes, with options specific to the needs of a project I'm working on.
Right now I have started writing tests for some of these methods and things are getting messy.
Example
One of the methods I have is a tree function that returns a hash of files and folders, where you can pass options like tree(only: 'folders', limit: 3). In order to test that it only goes down 3 levels, I would have to have 4+ levels of subfolders with dummy files in them.
The Problem
Right now I'm testing on folders outside the project since the subfolders are already there, but I want to move away from this, especially considering the implausibility of testing on system files once I start testing methods equivalent to rm -rf (as well as the lack of portability).
I'm starting to think that I need to create a "lab rat" type folder that I do all my "experiments" on, but I have no clue how to approach creating it.
Do I create a function that creates the files?
Do I pull files and folders from another location?
Do I use some sort of "lorem ipsum" generator for file structures?
Do I make all these files and folders manually (ugh)?
Do I just mock and stub the hell out of everything and not actually create/delete the files and folders? (I don't see this happening)
So...
How would someone normally approach testing excessive amounts of file and folder manipulation?
I don't think you want to use mocks/stubs. The file system of your OS should be well tested and fast, so the benefit of mocks/stubs is minimal. Creating a mock/stub system increases the complexity without much benefit.
Here are my answers:
Do I create a function that creates the files?
Yes. You can create tests for these functions to make sure that they are correct. Instead of calling Dir and File, write helper functions that make the code simple and readable. Maybe you can share the helper functions between the source/test code...
Do I pull files and folders from another location?
Not sure what this is for...
Do I use some sort of "lorem ipsum" generator for file structures?
Yes, if you mean create functions that generate file structures.
Do I make all these files and folders manually (ugh)?
No.
Do I just mock and stub the hell out of everything and not actually create/delete the files and folders? (I don't see this happening)
No. One benefit of creating real files/directories is that you can manually check what is going on and not be 100% dependent on the tests. This is actually a good approach, because without it there could be a bug where both the source code and the test code are not doing what you expect, but you wouldn't know, because everything would seem to be working.
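As a sketch of the "generate the lab-rat tree on demand" idea, here is one way it could look. It's written in Python for brevity (in Ruby, Dir.mktmpdir and FileUtils give you the same building blocks), and every name in it is made up:

import os
import shutil
import tempfile

def make_tree(root, depth, breadth=2, files_per_dir=2):
    # Build a throwaway tree: `depth` levels of nested folders,
    # each holding a few dummy files.
    for i in range(files_per_dir):
        with open(os.path.join(root, "file_%d.txt" % i), "w") as f:
            f.write("dummy\n")
    if depth > 0:
        for i in range(breadth):
            sub = os.path.join(root, "dir_%d" % i)
            os.makedirs(sub)
            make_tree(sub, depth - 1, breadth, files_per_dir)

# In a test: build the tree in a temp dir, run the method under test
# (e.g. something like tree(only: 'folders', limit: 3)) against it,
# assert on the result, then delete the whole thing.
lab = tempfile.mkdtemp(prefix="lab_rat_")
make_tree(lab, depth=4)
# ... assertions against `lab` go here ...
shutil.rmtree(lab)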

Operating system independent image addressing

Since I use both Windows and Ubuntu on my computer, I'd like to be able to create documents on either system. I have one directory of logos and I want to use them in any document, everywhere.
I solved the problem of the differing file paths with these commands:
\newcommand{\winlogo}{D:/logo/}
\newcommand{\linlogo}{/media/DATA/logo/}
\includegraphics{\winlogo logo_bw}
How can I provide a feature like this:
if(parameter==windows){address:=D:/logo/}
elseif(parameter==linux){address:=/media/DATA/logo/}
else{error}
I've run into this problem as well, and I found that hard-coding the paths is an absolutely terrible idea. Also, keeping these directories in sync will eventually be a problem once your projects begin to grow.
The way I solved this was to put everything in version control (I like git, your mileage may vary).
Then I created an images folder, so my folder hierarchy looks like this:
Working-Dir
|-- images/
|-- myfile.tex
|-- nextfile.tex
Then, in the preamble of my documents, I use \usepackage{graphicx} and \graphicspath{{images/}}, which tells LaTeX to look for a folder called images and then look for the graphics inside that folder.
Then I do my work on one computer, push my finished work back to the repo, and when I switch computers I just pull from the repo. This way everything stays in sync, no matter which computer I'm working on.
Treating TeX source like source code has greatly improved my workflow and efficiency. I'd suggest similar measures for anyone dealing with a lot of LaTeX source.
EDIT:
From: http://en.wikibooks.org/wiki/LaTeX/Importing_Graphics
Graphics storage
There is a way to tell LaTeX where to look for images: for example, it can be useful if you store images centrally for use in many different documents. The answer is the command \graphicspath, which you supply with an argument giving the name of an additional directory path you want searched when a file uses the \includegraphics command. Here are some examples:
\graphicspath{{c:\mypict~1\camera}}
\graphicspath{{/var/lib/images/}}
\graphicspath{{./images/}}
\graphicspath{{images_folder/}{other_folder/}{third_folder/}}
Please see http://www.ctan.org/tex-archive/macros/latex/required/graphics/grfguide.pdf
As you may have noticed, in the first example I've used the "safe" (MS-DOS) form of the Windows MyPictures folder, because it's a bad idea to use directory names containing spaces. Using absolute paths, \graphicspath does make your file less portable, while with relative paths (like the last example) you shouldn't have any problem with portability, but remember not to use spaces in file names. Alternatively, if you are using PDFLaTeX, you can use the package grffile, which will then allow you to use spaces in file names.
The third option should serve you well: just specify multiple paths for \graphicspath. I wonder if LaTeX will fail gracefully if you just include all of your paths in there (one for images, one for your logos on Linux, one for your logos on Windows)?
Mica, thank you once more, your advice works properly!
I've tested this code in the preamble; in a .sty file it doesn't work:
\usepackage{graphicx}
\graphicspath{{/media/DATA/logo/}{d:/logo/}{img/}}
where
/media/DATA/logo/ is the path to the directory with the logos on a mounted partition in Linux,
d:/logo/ is the path to the same directory in Windows, and
img/ is the path to the images for the current document in the actual working directory,
and this code in the document:
\includegraphics{logo_zcu_c} from the logo dir
\includegraphics{hvof} from the img/ dir
