Read a TIFF image in Lua LÖVE (love2d)

I wrote a little program using Lua and LÖVE. Now I would like to make it read some TIFF files, since LÖVE does not support this image format. And I failed.
Basically, LÖVE can read the file from some userdata. I thought that I might read the data with another library and convert it internally to a format that LÖVE supports, but I can't find anything suitable. I looked at the graphicsmagick Lua bindings, but unfortunately they do not appear to be up to date. I tried to get them to run, but gave up after a while; I would probably have to rewrite the whole package, and I can't even find some of the modules it uses (for example the "sys" module).
EDIT: Some more background. I need a fast image viewer to quickly browse through files on disk. I do not like to use the file manager for that purpose, and I would like it to behave exactly as I want it to behave. I was using xzgv for this purpose for years.
When I discovered Lua and LÖVE, I decided to write one, both as an exercise and because I want to have a little tool like that (you can see what it looks like here).

Here is a solution which does not require any additional Lua libraries, only the convert program from the ImageMagick suite. The idea is to convert the image and pipe its output to a file handle with io.popen. That way the file is read from storage only once.
local cmd = '/usr/bin/convert "%s" jpg:-'   -- ask convert to write a JPG to stdout
local file = "test.tiff"
local fh = io.popen(cmd:format(file), "r")  -- read the converted image from the pipe
local fdata = fh:read("*a")                 -- read all of it
fh:close()
fdata = love.filesystem.newFileData(fdata, file)  -- wrap the raw JPG bytes in a FileData
local img = love.graphics.newImage(fdata)
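One note on the command: depending on your ImageMagick installation the binary may live elsewhere or, on ImageMagick 7, be called magick rather than convert, so adjust the path in cmd accordingly.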

Related

Images loading without full resolution (Julia)

I have a .NEF file on my Desktop titled "my_image.nef". If I look at the details of the image, I see a resolution of 4256x2832.
When I try to open this with Julia, I get a two-dimensional array of size 120x160.
How do I get a full-resolution array to load? Why is it loading a much smaller version of the original image?
I'm not an expert on the various RAW file formats, but it's probably loading the thumbnail preview. There's good reason to hope this may be fairly easily resolvable: like many RAW formats, NEF appears to be a variation on TIFF, and Julia's TiffImages package is an amazingly good TIFF library. It's possible you'd have to create a "wrapper package" specifically for RAW or NEF, but it might end up being a fairly short exercise in piecing together the correct series of calls to TiffImages internals. I encourage you to file an issue at TiffImages to discuss it.
I ended up using the Julia command prompt to iteratively call ImageMagick, locally converting all the .NEF files to .PNG files and reading in the PNG files as arrays.
using Glob
using Shell

img_folder = "<IMAGE DIR>"   # directory containing the .NEF files, ending with a path separator
filenames = glob("*.NEF", img_folder)
for file in filenames
    fname = split(split(file, "\\")[end], ".")[1]   # bare name without path or extension
    fname1 = string(img_folder, fname, ".NEF")
    fname2 = string(img_folder, fname, ".png")
    cmd = string("convert ", fname1, " ", fname2)   # shell out to ImageMagick
    Shell.run(cmd)
end
Sloppy, but there didn't seem to be a tidy Julia-based package that worked well.

read a .fit file on Linux

How can I read Garmin's .fit files on Linux? I'd like to use them for some data analysis, but the file is a binary format.
I have visited http://garmin.kiesewetter.nl/ but the website does not seem to work.
Thanks
You can use GPSbabel to do this. It's a command-line tool, so you end up with something like:
gpsbabel -i garmin_fit -f {filename}.fit -o csv -F {output filename}.csv
and you'll get a text file with all the lat/long coordinates.
What's trickier is getting out other data, i.e. if you want speed, time, or other information from the .fit file. You can easily get those into a .gpx, where they're in XML and human-readable, but I haven't yet found a single-line solution for getting that data into a CSV.
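If your GPSBabel build includes the unicsv writer, something along these lines may get you closer (untested suggestion; unicsv writes whichever fields it can recover, such as time, altitude and speed, as columns):
gpsbabel -i garmin_fit -f {filename}.fit -o unicsv -F {output filename}.csv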
The company that created ANT made an SDK package available here:
https://www.thisisant.com/resources/fit
After unzipping it, you'll find a java/FitCSVTool.jar file. Then:
java -jar java/FitCSVTool.jar -b input.fit output.csv
I tested it with a couple of files and it seems to work really well. The format of the resulting CSV can be a little complex, though.
For example, latitude and longitude are stored in semicircles, so they should be multiplied by 180/2^31 to give GPS coordinates in degrees.
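For illustration, here is that conversion as a minimal C sketch (the sample value is made up):
#include <stdint.h>
#include <stdio.h>

/* FIT stores latitude/longitude as signed 32-bit "semicircles". */
static double semicircles_to_degrees(int32_t semicircles)
{
    return semicircles * (180.0 / 2147483648.0); /* 180 / 2^31 */
}

int main(void)
{
    printf("%.6f\n", semicircles_to_degrees(622191898)); /* roughly 52.15 degrees */
    return 0;
}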
You need to convert the file to a .csv; the Garmin repair tool at http://garmin.kiesewetter.nl/ will do this for you. I've just loaded the site fine; try again, it may have been temporarily down.
To add a little more detail:
"FIT or Flexible and Interoperable Data Transfer is a file format used for GPS tracks and routes. It is used by newer Garmin fitness GPS devices, including the Edge and Forerunner." From the OpenStreetMap Wiki http://wiki.openstreetmap.org/wiki/FIT
There are many tools to convert these files to other formats for different uses; which one you choose depends on the use. GPSBabel is another converter tool that may help: gpsbabel.org (I can't post two links yet :)
This page parses the file and lets you download it as tables: https://www.fitfileviewer.com/ The fun bit is converting the timestamps from numbers to readable timestamps (see the question "Garmin .fit file timestamp").

How can I convert an extremely big .dat file to image files? (Like JPG or something)

I have a folder of image files which have been compressed into .dat files. Since the .dat files are extremely huge (they are microscopic images of organs), I don't really know what kind of tools I can use to convert them into JPEG files. The best case would be that the whole image is split up into pieces, and I can get all the pieces of the image.
The ".dat" file suffix is used broadly, so you'll need to specify more details on what format/source software created the original data. As a guess, from a quick search of ".dat" format microscopy, these tools looks like they might be applicable to your domain:
http://gwyddion.net/
or
http://www.openmicroscopy.org/site/products/bio-formats
If you can't find a library for the format/language you are using, then you'll need to find documentation of the file format and write a converter (at least the reading portion of the converter - you can use something like libjpeg to handle the writing portion; a sketch of that part follows below).
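The writing side is fairly mechanical with libjpeg. A minimal sketch, assuming you have already decoded the pixels into an 8-bit RGB buffer in memory (error handling omitted; the name write_jpeg is only illustrative):
#include <stdio.h>
#include <jpeglib.h> /* libjpeg or libjpeg-turbo */

/* Write a width*height*3 byte, row-major RGB buffer as a JPEG file. */
int write_jpeg(const char *path, unsigned char *rgb, int width, int height)
{
    struct jpeg_compress_struct cinfo;
    struct jpeg_error_mgr jerr;
    FILE *out = fopen(path, "wb");
    if (!out) return -1;

    cinfo.err = jpeg_std_error(&jerr);
    jpeg_create_compress(&cinfo);
    jpeg_stdio_dest(&cinfo, out);

    cinfo.image_width = width;
    cinfo.image_height = height;
    cinfo.input_components = 3;      /* R, G, B */
    cinfo.in_color_space = JCS_RGB;
    jpeg_set_defaults(&cinfo);
    jpeg_set_quality(&cinfo, 90, TRUE);

    jpeg_start_compress(&cinfo, TRUE);
    while (cinfo.next_scanline < cinfo.image_height) {
        JSAMPROW row = rgb + (size_t)cinfo.next_scanline * width * 3;
        jpeg_write_scanlines(&cinfo, &row, 1);
    }
    jpeg_finish_compress(&cinfo);
    jpeg_destroy_compress(&cinfo);
    fclose(out);
    return 0;
}
For images as large as whole-organ microscopy you would normally write many tiles rather than one enormous JPEG, which is exactly the "split up into pieces" idea mentioned in the question.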

Convert multiple .txt files to multiple ASCII files fast - possible in MATLAB?

I have over 120 .txt files (all named like s1.txt, s2.txt, ..., s120.txt) that I need to convert to the ASCII extension to use in MATLAB.
My .txt files (comma-delimited) look like the following:
20080102,43.0300,3,9.493,569.567,34174.027,34174027
20080102,43.0600,3,9.498,569.897,34193.801,34193801
In MATLAB I wish to use code similar to the following:
for i = svec;
    %# where svec = [1 2 13 15] some random number between 1 and 120.
    eval(['load %mydirectory', eval(['s',int2str(i)]),'.ascii']);
end;
If I am not mistaken, I can't use the above command with .txt files and therefore I must use ASCII files.
Since I have a lot of files to convert and they are large, is there a quick way to convert all my files via MATLAB, or perhaps good conversion software available for Mac? Would anyone have a better suggestion than using the code above?
Adding to nrz's answer:
I'm not sure what you want to do exactly, but know that you can open any file in MATLAB, either as text (ASCII) or in binary mode. The latter can be achieved using fread.
As a side note, you also asked for a better suggestion for your code.
Well, what did you try to achieve with the two eval invocations? Why not call the commands directly? Do this instead:
for i = svec
    load(['%mydirectory\s', int2str(i), '.txt'], '-ascii');
end
I also took the liberty of adding a backslash that I think you had omitted.
In most cases, you'd be better off without using eval. Check the alternatives...
Can you show an example file? Not every text file is valid for the load command. If your file is not in a valid format, changing the extension from .txt to .ascii doesn't help at all. Instead, in that case the data must either be converted to a format valid for the load command or, alternatively, loaded into MATLAB by some other means, e.g. by using fscanf or xlsread. The file structure is needed for both approaches.
See also the question "load command in matlab loading blank file".
A slightly cleaner way:
for i = 1:120
    fname = fullfile('mydirectory', sprintf('s%d.txt', i));
    X = load(fname, '-ascii');
end

I need to write a .DDS file cross-platform, can someone point me to example?

I need to create a .DDS file with code that runs on both OSX and Windows. Although the format doesn't look difficult, I'd still like an example of writing the file. Note I don't need to read it, just write it.
C or C++ and RGBA bitmap.
I finally resorted to writing a RAW file and using GraphicConverter (Mac) to read it and write the DDS file. I think Photoshop can do it too. RAW files are simply RGB or RGBA or similar formats written straight to a binary file. Then in the reading application you tell it the dimensions so it can read the data in, and you export to whatever you need. Not a perfect solution, but it worked for what I needed.
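If you'd rather emit the DDS directly, the uncompressed case is just a fixed 128-byte header followed by the raw pixels. Here is a minimal C sketch, assuming 32-bit pixels with bytes ordered R,G,B,A and rows stored top-down (write_dds_rgba and put_u32 are only illustrative names, not from any particular library):
#include <stdint.h>
#include <stdio.h>

/* Write a 32-bit value as little-endian, regardless of host byte order. */
static void put_u32(FILE *f, uint32_t v)
{
    uint8_t b[4] = { v & 0xff, (v >> 8) & 0xff, (v >> 16) & 0xff, (v >> 24) & 0xff };
    fwrite(b, 1, 4, f);
}

/* Write an uncompressed 32-bit RGBA image as a DDS file. */
int write_dds_rgba(const char *path, const uint8_t *pixels,
                   uint32_t width, uint32_t height)
{
    FILE *f = fopen(path, "wb");
    if (!f) return -1;

    put_u32(f, 0x20534444u);                    /* magic "DDS " */
    put_u32(f, 124);                            /* dwSize */
    put_u32(f, 0x1 | 0x2 | 0x4 | 0x8 | 0x1000); /* CAPS|HEIGHT|WIDTH|PITCH|PIXELFORMAT */
    put_u32(f, height);
    put_u32(f, width);
    put_u32(f, width * 4);                      /* dwPitchOrLinearSize: bytes per row */
    put_u32(f, 0);                              /* dwDepth */
    put_u32(f, 0);                              /* dwMipMapCount */
    for (int i = 0; i < 11; i++) put_u32(f, 0); /* dwReserved1[11] */

    /* DDS_PIXELFORMAT: uncompressed RGB with alpha, 32 bits per pixel */
    put_u32(f, 32);                             /* ddspf.dwSize */
    put_u32(f, 0x40 | 0x1);                     /* DDPF_RGB | DDPF_ALPHAPIXELS */
    put_u32(f, 0);                              /* dwFourCC (unused here) */
    put_u32(f, 32);                             /* dwRGBBitCount */
    put_u32(f, 0x000000ffu);                    /* red mask   */
    put_u32(f, 0x0000ff00u);                    /* green mask */
    put_u32(f, 0x00ff0000u);                    /* blue mask  */
    put_u32(f, 0xff000000u);                    /* alpha mask */

    put_u32(f, 0x1000);                         /* dwCaps: DDSCAPS_TEXTURE */
    put_u32(f, 0);                              /* dwCaps2 */
    put_u32(f, 0);                              /* dwCaps3 */
    put_u32(f, 0);                              /* dwCaps4 */
    put_u32(f, 0);                              /* dwReserved2 */

    fwrite(pixels, 1, (size_t)width * height * 4, f);  /* top-down scanlines */
    return fclose(f) == 0 ? 0 : -1;
}
Compressed (FourCC/DXTn) variants need extra fields, but for a plain RGBA bitmap this header plus the pixel data is the entire file.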
