How can I use OBEX FTP to delete and copy a file to a destination device over Bluetooth in Java? - ftp

I need to replace an old version of a file with a new one on a destination device over Bluetooth.
I know that the OBEX profiles (FTP and OPP) are needed for this.
But I don't know how to delete the old version and copy the new version of the file into the destination directory (in Java code).
Can you help me, please?

To perform operations on files, you should first change to the directory where the file is.
For example, if you need to get to /root/directory/subdir/
you should call setPath three times:
setPath(""); // go to /root/
setPath("directory"); // go to /root/directory/
setPath("subdir"); // go to /root/directory/subdir/
All the code below is for J2ME.
I use this method to set a path given with separators (e.g. /root/dir/):
private void moveToDirectory(String dir) throws IOException {
    RE r = new RE("/"); // where RE is me.regexp.RE
    setDir("");
    String[] dirs = r.split(dir);
    for (int i = 1; i < dirs.length; i++) setDir(dirs[i]);
}
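The setDir helper isn't shown here; the following is a minimal sketch of what it might look like, assuming it simply wraps ClientSession.setPath (JSR-82 javax.obex) and that cs is the already opened ClientSession:
private void setDir(String dir) throws IOException {
    HeaderSet hs = cs.createHeaderSet();
    hs.setHeader(HeaderSet.NAME, dir); // an empty name resets to the root folder
    cs.setPath(hs, false, false);      // backup = false, create = false
}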
To delete a file, you should open a PUT operation on it and close it, or use the delete method in ClientSession:
public void delete() throws IOException {
    HeaderSet hs = cs.createHeaderSet(); // where cs is an opened ClientSession
    hs.setHeader(HeaderSet.NAME, file);  // file is a filename String; no slashes should be used
    cs.delete(hs);
}
If you need to replace a file, you probably don't need to call the delete method; just open an OutputStream and write the new contents to it:
public OutputStream openOutputStream() throws IOException {
    HeaderSet hs = cs.createHeaderSet();
    hs.setHeader(HeaderSet.NAME, file);
    Operation op = cs.put(hs); // Operation should be global, so you can close it after you are done
    return op.openOutputStream();
}
Remember to close the Operation after you are done with the streams.
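For context, here is a minimal end-to-end sketch tying the snippets above together. It assumes cs, file and op are instance fields as the comments above imply, that moveToDirectory and openOutputStream are the methods shown earlier, and that the btgoep:// URL, directory and file name are placeholders rather than values from the question (imports needed: javax.microedition.io.Connector, javax.obex.ClientSession, java.io.OutputStream):
public void replaceRemoteFile(byte[] newContent) throws IOException {
    // placeholder OBEX FTP service URL (device address and channel are examples)
    cs = (ClientSession) Connector.open("btgoep://001122334455:5");
    try {
        cs.connect(null);                            // OBEX CONNECT
        moveToDirectory("/root/directory/subdir/");  // helper shown above
        file = "data.txt";                           // NAME header value, no slashes
        OutputStream os = openOutputStream();        // opens the PUT Operation (op)
        try {
            os.write(newContent);                    // overwrites the old version
        } finally {
            os.close();                              // close the stream first...
            op.close();                              // ...then the Operation
        }
        cs.disconnect(null);                         // OBEX DISCONNECT
    } finally {
        cs.close();
    }
}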

Related

How to list all children in Google Drive's appfolder and read file contents with Xamarin / c#?

I'm trying to work with text files in the apps folder.
Here's my GoogleApiClient constructor:
googleApiClient = new GoogleApiClient.Builder(this)
    .AddApi(DriveClass.API)
    .AddScope(DriveClass.ScopeFile)
    .AddScope(DriveClass.ScopeAppfolder)
    .UseDefaultAccount()
    .AddConnectionCallbacks(this)
    .EnableAutoManage(this, this)
    .Build();
I'm connecting with:
googleApiClient.Connect()
And after:
OnConnected()
I need to list all files inside the app folder. Here's what I got so far:
IDriveFolder appFolder = DriveClass.DriveApi.GetAppFolder(googleApiClient);
IDriveApiMetadataBufferResult result = await appFolder.ListChildrenAsync(googleApiClient);
Which is giving me the files metadata.
But after that, I don't know how to read them, edit them or save new files. They are text files created with my app's previous version (native).
I'm following the google docs for drive but the Xamarin API is a lot different and has no docs or examples. Here's the API I'm using: https://components.xamarin.com/view/googleplayservices-drive
Edit:
Here is an example to read file contents from the guide:
DriveFile file = ...
file.open(mGoogleApiClient, DriveFile.MODE_READ_ONLY, null)
    .setResultCallback(contentsOpenedCallback);
First I can't find anywhere in the guide what "DriveFile file = ..." means. How do I get this instance? DriveFile seems to be a static class in this API.
I tried:
IDriveFile file = DriveClass.DriveApi.GetFile(googleApiClient, metadata.DriveId);
This has two problems: first, it complains that GetFile is deprecated but doesn't say how to do it properly; second, the file doesn't have an "open" method.
Any help is appreciated.
The Xamarin binding library wraps the Java Drive library (https://developers.google.com/drive/), so all the guides/examples for the Android-based Drive API work if you keep in mind the Binding's Java to C# transformations:
get/set methods -> properties
fields -> properties
listeners -> events
static nested class -> nested class
inner class -> nested class with an instance constructor
So you can list the AppFolder's directories and files by recursing through the Metadata whenever a drive item is a folder.
Get Directory/File Tree Example:
await Task.Run(() =>
{
    async void GetFolderMetaData(IDriveFolder folder, int depth)
    {
        var folderMetaData = await folder.ListChildrenAsync(_googleApiClient);
        foreach (var driveItem in folderMetaData.MetadataBuffer)
        {
            Log.Debug(TAG, $"{(driveItem.IsFolder ? "(D)" : "(F)")}:{"".PadLeft(depth, '.')}{driveItem.Title}");
            if (driveItem.IsFolder)
                GetFolderMetaData(driveItem.DriveId.AsDriveFolder(), depth + 1);
        }
    }
    GetFolderMetaData(DriveClass.DriveApi.GetAppFolder(_googleApiClient), 0);
});
Output:
[SushiHangover.FlightAvionics] (D):AppDataFolder
[SushiHangover.FlightAvionics] (F):.FlightInstrumentationData1.json
[SushiHangover.FlightAvionics] (F):.FlightInstrumentationData2.json
[SushiHangover.FlightAvionics] (F):.FlightInstrumentationData3.json
[SushiHangover.FlightAvionics] (F):AppConfiguration.json
Write a (Text) File Example:
using (var contentResults = await DriveClass.DriveApi.NewDriveContentsAsync(_googleApiClient))
using (var writer = new OutputStreamWriter(contentResults.DriveContents.OutputStream))
using (var changeSet = new MetadataChangeSet.Builder()
    .SetTitle("AppConfiguration.txt")
    .SetMimeType("text/plain")
    .Build())
{
    writer.Write("StackOverflow Rocks\n");
    writer.Write("StackOverflow Rocks\n");
    writer.Close();
    await DriveClass.DriveApi.GetAppFolder(_googleApiClient).CreateFileAsync(_googleApiClient, changeSet, contentResults.DriveContents);
}
Note: Substitute a IDriveFolder for DriveClass.DriveApi.GetAppFolder to save a file in a subfolder of the AppFolder.
Read a (text) File Example:
Note: driveItem in the following example is an existing text/plain-based MetaData object that is found by recursing through the Drive contents (see Get Directory/File list above) or via creating a query (Query.Builder) and executing it via DriveClass.DriveApi.QueryAsync.
var fileContexts = new StringBuilder();
using (var results = await driveItem.DriveId.AsDriveFile().OpenAsync(_googleApiClient, DriveFile.ModeReadOnly, null))
using (var inputStream = results.DriveContents.InputStream)
using (var streamReader = new StreamReader(inputStream))
{
    while (streamReader.Peek() >= 0)
        fileContexts.Append(await streamReader.ReadLineAsync());
}
Log.Debug(TAG, fileContexts.ToString());

Spring Batch ~ Dynamic commit interval or a custom completion policy

What do I have?
A Spring Integration flow that recursively watches a folder for new CSV files and sends them to Spring Batch.
The job: read the CSV file; in the processor, modify some data in the items; then use a custom writer to save the data to the DB.
The problem?
I have a dynamic number of CSVs being sent to the batch. I want the job's commit interval to be based on the number of items (lines) in each CSV file. In other words, I don't want to commit my data after every fixed number of items, but at the end of each file. Example: CSV 1 has 200 lines; I want to process all the lines, write them, commit, close the transaction, and then read the next CSV.
I have two ideas, but I don't know which is best or how to implement it:
Get the number of lines in my CSV from the reader and pass it to the commit interval as a job parameter, like so: #{jobParameters['commit.interval.value']}
Implement a custom CompletionPolicy to replace my commit interval. How do I implement isComplete()? Do you have any examples? A GitHub project?
But before all that, how can I get the number of items?
Could anyone help me? A code sample, maybe?
Thank you in advance.
No answer here, but I found a solution.
I'm using a dynamic commit interval instead of a completion policy.
With Spring Batch Integration, I can use a transformer to send my file to the batch. For that I have a custom class, FileMessageToJobRequest, to which I added this function that counts the lines:
public static int countLines(String filename) throws IOException {
    InputStream is = new BufferedInputStream(new FileInputStream(filename));
    try {
        byte[] c = new byte[1024];
        int count = 0;
        int readChars = 0;
        boolean empty = true;
        while ((readChars = is.read(c)) != -1) {
            empty = false;
            for (int i = 0; i < readChars; ++i) {
                if (c[i] == '\n') {
                    ++count;
                }
            }
        }
        return (count == 0 && !empty) ? 1 : count;
    } finally {
        is.close();
    }
}
and this one to send the parameters:
@Transformer
public JobLaunchRequest toRequest(Message<File> message) throws IOException {
    JobParametersBuilder jobParametersBuilder = new JobParametersBuilder();
    jobParametersBuilder.addString("commit.interval", Integer.toString(countLines(message.getPayload().getAbsolutePath())));
    return new JobLaunchRequest(job, jobParametersBuilder.toJobParameters());
}
and in my job context, I just added this commit-interval="#{jobParameters['commit.interval']}"
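For reference, here is a rough sketch of how that attribute might sit in the XML job definition; the job, step and bean ids below are placeholders, not taken from the original post:
<batch:job id="csvImportJob">
    <batch:step id="csvImportStep">
        <batch:tasklet>
            <batch:chunk reader="csvReader" processor="csvProcessor" writer="csvWriter"
                         commit-interval="#{jobParameters['commit.interval']}"/>
        </batch:tasklet>
    </batch:step>
</batch:job>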
Hope it helps someone in need ;)

NotifyFilter of FileSystemWatcher not working

I have a Windows service (and I verified the code by creating a similar WinForms application) where the NotifyFilter doesn't work. As soon as I remove that line of code, the service works fine and I can see the event handler fire in the WinForms application.
All I'm doing is dropping a text file into the input directory for the FileSystemWatcher to kick off the watcher_FileChanged delegate. When I have the _watcher.NotifyFilter = NotifyFilters.CreationTime; in there, it doesn't work. When I pull it out, it works fine.
Can anyone tell me if I'm doing something wrong with this filter?
Here is the FSW code for the OnStart event.
protected override void OnStart(string[] args)
{
    _watcher = new FileSystemWatcher(@"C:\Projects\Data\Test1");
    _watcher.Created += new FileSystemEventHandler(watcher_FileChanged);
    _watcher.NotifyFilter = NotifyFilters.CreationTime;
    _watcher.IncludeSubdirectories = false;
    _watcher.EnableRaisingEvents = true;
    _watcher.Error += new ErrorEventHandler(OnError);
}
private void watcher_FileChanged(object sender, FileSystemEventArgs e)
{
    // Folder with new files - one or more files
    string folder = @"C:\Projects\Data\Test1";
    System.Console.WriteLine(@"C:\Projects\Data\Test1");
    //Console.ReadKey(true);
    // Folder to delete old files - one or more files
    string output = @"C:\Temp\Test1\";
    System.Console.WriteLine(@"C:\Temp\Test1\");
    //Console.ReadKey(true);
    // Create name to call new zip file by date
    string outputFilename = Path.Combine(output, string.Format("Archive{0}.zip", DateTime.Now.ToString("MMddyyyy")));
    System.Console.WriteLine(outputFilename);
    //Console.ReadKey(true);
    // Save new files into a zip file
    using (ZipFile zip = new ZipFile())
    {
        // Add all files in directory
        foreach (var file in Directory.GetFiles(folder))
        {
            zip.AddFile(file);
        }
        // Save to output filename
        zip.Save(outputFilename);
    }
    DirectoryInfo source = new DirectoryInfo(output);
    // Get info of each file in the output directory to see whether or not to delete it
    foreach (FileInfo fi in source.GetFiles())
    {
        if (fi.CreationTime < DateTime.Now.AddDays(-1))
            fi.Delete();
    }
}
I've been having trouble with this behavior too. If you step through the code (and if you look at the MSDN documentation), you'll find that NotifyFilter starts off with a default value of:
NotifyFilters.FileName | NotifyFilters.DirectoryName | NotifyFilters.LastWrite
So when you say .NotifyFilter = NotifyFilters.CreationTime, you're wiping out those other values, which explains the difference in behavior. I'm not sure why NotifyFilters.CreationTime isn't catching the new file... it seems like it should, shouldn't it?
You can probably just use the default value for NotifyFilter if it's working for you. If you want to add NotifyFilters.CreationTime, I'd recommend doing something like this to add the new value and not replace the existing ones:
_watcher.NotifyFilter = _watcher.NotifyFilter | NotifyFilters.CreationTime;
I know this is an old post, but file creation time is not always reliable. I came across a problem where a log file was being moved to an archive folder and a new file of the same name was created in its place; however, the file creation date did not change. In fact, the metadata was retained from the previous file (the one that was moved to the archive).
Windows caches certain attributes of a file, and the file creation date is one of them. You can read the article here: https://support.microsoft.com/en-us/kb/172190.

How to design my mapper?

I have to write a MapReduce job, but I don't know how to go about it.
I have a jar, MARD.jar, through which I can instantiate MARD objects.
Using it, I call the normalise-file method, i.e. mard.normaliseFile(bunch of arguments).
This in turn creates certain output files.
For the normalise method to run, it needs a folder called myMard in the working directory.
So I thought I would give the myMard folder as the input path to the Hadoop job, but I'm not sure that would help, because mard.normaliseFile(bunch of arguments) will search for the myMard folder in the working directory and will not find it, since (this is what I think) the Mapper can only access the contents of files through the "values" obtained from the FileSplit; it cannot give direct access to the files in the myMard folder.
In short, I have to execute the following code through MapReduce:
File setupFolder = new File(setupFolderName);
setupFolder.mkdirs();
MARD mard = new MARD(setupFolder);
Text valuz = new Text();
IntWritable intval = new IntWritable();
File original = new File("Vca1652.txt");
File mardedxml = new File("Vca1652-mardedxml.txt");
File marded = new File("Vca1652-marded.txt");
mardedxml.createNewFile();
marded.createNewFile();
NormalisationStats stats;
try {
    stats = mard.normaliseFile(original, mardedxml, marded, 50.0);
    // This method requires access to the myMard folder
    System.out.println(stats);
} catch (MARDException e) {
    // TODO Auto-generated catch block
    e.printStackTrace();
}
Please help

How do you save images to a Blackberry device via HttpConnection?

My code fetches XML via HttpConnection and saves it to the persistent store. No problems there.
Then I loop through the saved data to compose a list of image URLs to fetch via a queue.
Each of these requests calls the HttpConnection thread like so:
...
public synchronized void run()
{
    HttpConnection connection = (HttpConnection) Connector.open("http://www.somedomain.com/image1.jpg");
    connection.setRequestMethod("GET");
    String contentType = connection.getHeaderField("Content-type");
    InputStream responseData = connection.openInputStream();
    connection.close();
    outputFinal(responseData, contentType);
}
public synchronized void outputFinal(InputStream result, String contentType) throws SAXException, ParserConfigurationException, IOException
{
    if (contentType.startsWith("text/"))
    {
        // bunch of xml save code that works fine
    }
    else if (contentType.equals("image/png") || contentType.equals("image/jpeg") || contentType.equals("image/gif"))
    {
        // how to save images here?
    }
    else
    {
        // default
    }
}
What I can't find any good documentation on is how one would take the response data and save it to an image stored on the device.
Maybe I just overlooked something very obvious. Any help is very appreciated.
Thanks
I tried following this advice and found the same thing I always find when looking up BlackBerry-specific issues: nothing.
The problem is that every example or post assumes you know everything about the platform.
Here's a simple question: what line of code writes the downloaded stream to the BlackBerry device? To what path? How do I retrieve it later?
I have this code, which I don't know if it does anything, because I don't know where it is supposedly writing to or whether that's even what it is doing at all:
** filename is determined in a loop based on the URL called.
FileOutputStream fos = null;
try
{
    fos = new FileOutputStream(File.FILESYSTEM_PATRIOT, filename);
    byte[] buffer = new byte[262144];
    int byteRead;
    while ((byteRead = result.read(buffer)) != -1)
    {
        fos.write(buffer, 0, byteRead);
    }
    fos.flush();
    fos.close();
}
catch (IOException ieo)
{
}
finally
{
    if (fos != null)
    {
        fos.close();
    }
}
The idea is that I have some 600 images pulled from a server. I need to loop through the XML and save each image to the device so that when an entity is called, I can pull the associated image (entity_id.png) from internal storage.
The documentation from RIM does not specify this, nor does it make it easy to begin figuring it out.
This issue does not seem to be addressed on this forum, or others I have searched.
Thanks
You'll need to use the Java FileOutputStream to do the writing. You'll also want to close the connection after reading the data from the InputStream (move outputFinal above your call to close). You can find all kinds of examples regarding FileOutputStream easily.
See here for more. Note that in order to use the FileOutputStream your application must be signed.
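Since the link above doesn't show the actual write, here is a minimal sketch of one common approach on BlackBerry, using the JSR-75 FileConnection API rather than a FileOutputStream; the file:///store/home/user/images/ path and the saveImage name are placeholders, and file-system access still requires signed code:
// Assumes: import java.io.*; import javax.microedition.io.Connector;
// import javax.microedition.io.file.FileConnection;
private void saveImage(InputStream responseData, String filename) throws IOException {
    FileConnection fc = (FileConnection) Connector.open(
            "file:///store/home/user/images/" + filename, Connector.READ_WRITE);
    try {
        if (!fc.exists()) {
            fc.create();                     // create the file the first time
        }
        OutputStream out = fc.openOutputStream();
        try {
            byte[] buffer = new byte[4096];
            int read;
            while ((read = responseData.read(buffer)) != -1) {
                out.write(buffer, 0, read);  // copy the HTTP response to the file
            }
            out.flush();
        } finally {
            out.close();
        }
    } finally {
        fc.close();
    }
}
To read the image back later, open the same file:// URL with Connector.open and call openInputStream() on the FileConnection.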
