I'm trying to create a torrent without a tracker. I just want to send some GoPro footage to a friend of mine but I can't get it working.
I've created a torrent with nothing in the tracker field and set it to public (the private flag is unchecked). I sent myself the file to test downloading it on another computer.
I can't get it to work. I've opened my port; when I run the test, it says the port is open on both my laptop and my desktop. I've got a static IP on the desktop, which is where the torrent was created and is being seeded, and I emailed the .torrent to myself. I'm running Transmission on both machines, but one is Windows and one is Mac.
Any ideas why I can't get it to work? DHT is enabled on both.
Transmission is not the best option for that. I'd advise you to use mktorrent; it's faster and more reliable, and in all my time using it, it has never failed.
If you still want to use Transmission as the seeding/uploading client, you can use this PHP script. Just edit it to save the generated .torrent files into Transmission's watch folder (or any folder you want):
<?php
define('ROOT_DIR', '/home');
define('SCAN_DIR', ROOT_DIR.'/scan');
define('COMPLETE_DIR', ROOT_DIR.'/complete');
define('TORRENT_DIR', ROOT_DIR.'/torrent');
define('ANNOUNCE_URL', 'YOUR-TRACKER-ANNOUNCE-AND-PASSKEY-HERE');
define('PIECE_SIZE', '21'); // exponent passed to mktorrent -l: 2^21 = 2 MiB pieces
function move($source, $dest) {
$cmd = 'mv "'.$source.'" "'.$dest.'"';
exec($cmd, $output, $return_val);
if ($return_val == 0) return 1;
return 0;
}
function make_torrent($file_full, $new_dir, $file) {
$file = pathinfo($file_full, PATHINFO_BASENAME);
$move_file = $new_dir.'/'.$file;
$rez = move($file_full, $move_file);
if (!$rez) die('Cannot move file!');
$info = pathinfo($file);
$output = TORRENT_DIR.'/'.$info['basename'].'.torrent';
if (file_exists($output)) unlink($output);
$cmd = "mktorrent '$move_file' -o '$output'-l".PIECE_SIZE." -a ".ANNOUNCE_URL;
echo $cmd."<br /> <br /> \n \n";
exec($cmd);
if (file_exists($output)) return $output;
else die('Cannot make torrent!');
}
function scan_folder() {
$dir = SCAN_DIR;
$dir_done = COMPLETE_DIR;
if (!is_dir($dir_done))
{
$ok = mkdir($dir_done);
if (!$ok) die('Cannot create destination folder!');
}
$dh = opendir($dir);
while ( ($file = readdir($dh)) !== false )
{
if ($file == '.' || $file == '..') continue;
$file_full = $dir.'/'.$file;
if ($file_full == COMPLETE_DIR) continue;
make_torrent($file_full, $dir_done, $file);
}
}
scan_folder();
?>
You can ignore the tracker lines (or just delete them); the script is more or less self-explanatory. If you need any help, ask away.
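For example, pointing the script at a Transmission watch folder is just a matter of changing the constants at the top. Here is a minimal sketch of that edit; every path below is a placeholder for illustration (the watch folder is whatever you configured in Transmission's preferences, or as watch-dir in settings.json for the daemon), not a default:
define('ROOT_DIR', '/home/media');                            // hypothetical base directory
define('SCAN_DIR', ROOT_DIR.'/scan');                         // drop finished files here
define('COMPLETE_DIR', ROOT_DIR.'/complete');                 // files are moved here before hashing
define('TORRENT_DIR', '/home/user/transmission/watch');       // Transmission picks up .torrent files from here
define('ANNOUNCE_URL', 'udp://tracker.example.com/announce'); // your tracker announce URL + passkey
define('PIECE_SIZE', '21');                                   // 2^21 = 2 MiB pieces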
iCODEiT 0UT
Related
In a Laravel 5.7 app I need to write a CSV file and have the browser download it, using:
\Response::download(
I try to write it to the tmp directory, and when checking the value I see:
$sys_get_temp_dir= sys_get_temp_dir();
The line above returns /tmp.
Next I save the file as:
$dest_csv_file= $sys_get_temp_dir . '/box_rooms_'.time().'.csv';
$header = ["Id",... ];
$fp = fopen($dest_csv_file, "w");
fputcsv($fp, $header);
foreach ($storageSpacesCollection as $line) {
fputcsv($fp, $line);
}
fclose($fp);
But searching on my local Ubuntu 18 machine (the server also runs Ubuntu), I found the generated file at:
/tmp/systemd-private-6a9ea6844b9c4c94883d23e4fb3e2215-apache2.service-shqCeo/tmp/box_rooms_1576324869.csv
That is very strange, as I do not know how to read it from that tmp path. Can it be done?
I think you need to pass the Content-Type header properly:
$filePath = public_path() . "/upload/example.doc";
$headersContent = ['Content-Type' => 'application/msword'];
return Response::download($filePath, 'example.doc', $headersContent);
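As for the tmp path in the question: the long /tmp/systemd-private-.../tmp/... location is just how the rest of the system sees Apache's private /tmp (systemd's PrivateTmp feature); inside the PHP process the file really is at the plain /tmp path that sys_get_temp_dir() returned, so it can be read back and served within the same request. A minimal sketch along those lines, assuming the $header and $storageSpacesCollection variables from the question:
$dest_csv_file = sys_get_temp_dir() . '/box_rooms_' . time() . '.csv';
$fp = fopen($dest_csv_file, 'w');
fputcsv($fp, $header);
foreach ($storageSpacesCollection as $line) {
    fputcsv($fp, $line);
}
fclose($fp);
// Same path PHP wrote to; serve it and clean up afterwards.
return \Response::download(
    $dest_csv_file,
    'box_rooms.csv',
    ['Content-Type' => 'text/csv']
)->deleteFileAfterSend(true);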
I created a duplicate of a download extension from a colleague; it is basically an extension that just provides files to download in the back end.
Problem:
If I try to download a file while I am logged in to the back end (the extension is only accessible after back-end login), it works perfectly fine.
However, if I open a private browser window where I am not logged in to the back end, it always cuts off the file and only downloads the first 40 KB, even though the file is normally 10 MB. Why is the file cut off?
I can download small files (< 40 KB) without them getting cut off.
NOTE:
Before I edited the extension, the download worked perfectly even when not logged in to the back end, and the download was triggered the same way.
Currently I am comparing the code, but the logic looks fine, since I did not change much (I added a new model, renamed the extension and some other things).
Does anyone have a clue what could lead to this problem?
This is the relevant part of my download controller, where I first get the public URL of the file by passing its fid and then trigger the download by sending headers.
...
if ($this->request->hasArgument('fid')) {
$this->fid = $this->request->getArgument('fid');
}
if ($this->request->hasArgument('cid')) {
$this->cid = $this->request->getArgument('cid');
}
$fileobj = $this->fileRepository->findByUid($this->fid);
if ($fileobj->getFile() !== null) {
$downloadFilePath = $fileobj->getFile()->getOriginalResource()->getPublicUrl();
if (file_exists($downloadFilePath)) {
$fileCounter = (int)$fileobj->getCounter();
$fileobj->setCounter(++$fileCounter);
$oldChecksum = $fileobj->getChecksume();
$groesse = filesize($downloadFilePath);
if (isset($oldChecksum)) {
$checksum = sha1_file($downloadFilePath);
$fileobj->setChecksume($checksum);
}
// update fileobj
$this->fileRepository->update($fileobj);
// Unset fileobj before persists, otherwise there will be also changes
$this->persistenceManager->persistAll();
// If file exists, force download
$fileName = basename($downloadFilePath);
$this->response->setHeader('Content-Type', "application/force-download", TRUE);
$this->response->setHeader('Content-Disposition', 'attachment; filename=' . $fileName, TRUE);
$this->response->setHeader('Content-Length', $groesse, TRUE);
#readfile($downloadFilePath);
$this->response->sendHeaders();
return true; // I could also delete this line, since it is never reached.
} else {
//send emails to everyone who is entered in the address list in the extension configuration.
$this->sendEmails('missing_file', $fileobj);
$this->redirect(
'list',
'Category',
NULL,
array(
'missing' => array(
'fileId' => $this->fid,
'category' => $this->cid
)
)
);
}
}
The 40 KB file does not contain anything that shouldn't be there; it is just cut off. I tested it by writing a lot of numbers to a file line by line and downloading it. Result: only a couple thousand numbers end up in the file instead of all of them.
I tried it both with files stored on an FTP server and with files stored in user_upload; same result.
Here you can see the 40 KB file:
http://pasteall.org/459911
Snippet (in case the link is down):
<ul>
<li>0</li>
<li>1</li>
<li>2</li>
<li>3</li>
<li>4</li>
<li>5</li>
<li>6</li>
<li>7</li>
<li>8</li>
<li>9</li>
//Cut because stackoverflow does not allow me to post such big texts
...
<li>3183</li>
<li>3184</li>
<li>3185</li>
<li>3186</li>
<li
You can see that it stops downloading the rest; the question is: why?
UPDATE:
I changed it to this:
// If file exists, force download
$fileName = basename($downloadFilePath);
$this->response->setHeader('Content-Type', "application/force-download", TRUE);
$this->response->setHeader('Content-Disposition', 'attachment; filename=' . $fileName, TRUE);
$this->response->setHeader('Content-Length', $groesse, TRUE);
ob_start();
ob_flush();
flush();
$content = file_get_contents($downloadFilePath);
$this->response->setContent($content);
$this->response->sendHeaders();
return true; // I could also delete this line, since it is never reached.
Now the file is downloaded completely, but it is wrapped inside the HTML from the template; it gets rendered inside the Fluid variable mainContent, like this:
...
<!--TYPO3SEARCH_begin-->
<div class="clearfix col-sm-{f:if(condition:'{data.backend_layout} == 4',then:'12',else:'9')} col-md-{f:if(condition:'{data.backend_layout} == 4',then:'9',else:'6')} col-lg-{f:if(condition:'{data.backend_layout} == 4',then:'10',else:'8')} mainContent">
<f:format.raw>{mainContent}</f:format.raw>
</div>
<!--TYPO3SEARCH_end-->
...
It gets weirder and weirder...
I finally solved the problem. I just had to execute exit or die after sending the headers:
#readfile($downloadFilePath);
$this->response->sendHeaders();
exit;
NOTE: If you end your code with exit or die, then a TYPO3 session value set with e.g. $GLOBALS['TSFE']->fe_user->setKey("ses", "token", DownloadUtility::getToken(32)); will no longer be persisted when you are not logged in to the back end! Use $GLOBALS['TSFE']->fe_user->setAndSaveSessionData("token", DownloadUtility::getToken(32)); instead if no login should be required.
Now it works even if not logged in to the front end.
But I still don't know why the download worked without being cut off while I was logged in to the back end, even though the exit statement was missing. That's extremely weird, and we have no explanation.
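For reference, here is a condensed sketch of the pattern that ends up working, using the same $downloadFilePath and $groesse variables as the controller above (with readfile() re-enabled); it is an illustration, not necessarily the exact final code of the extension. The exit is the crucial part: without it, TYPO3 keeps rendering the page, which is what wrapped or truncated the download.
// Force the download, then stop TYPO3's normal page rendering.
$fileName = basename($downloadFilePath);
$this->response->setHeader('Content-Type', 'application/force-download', TRUE);
$this->response->setHeader('Content-Disposition', 'attachment; filename="' . $fileName . '"', TRUE);
$this->response->setHeader('Content-Length', $groesse, TRUE);
$this->response->sendHeaders();
readfile($downloadFilePath); // stream the file itself
exit;                        // prevents the template (mainContent) from being appended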
I'm in the process of writing a PowerShell script to help with setting up new PCs at my work. It will hopefully be used by more than just me, so I'm trying to think of everything.
I have offline installers (Java, Flash, Reader, etc.) saved on our FTP server, which the script downloads if a local copy hasn't already been saved in the Apps directory it creates. Periodically the files on the FTP server get updated as new versions of the programs are released. I want the script to have an option to check for newer versions of the installers, in case someone likes to carry around the local copies and forgets to check the server every now and then. It also needs to work on Windows 7 without importing additional modules, unless there's an easy way to do that on multiple PCs at a time. I know about the import command, but in my experience I've had to copy the module files into several places on the PC before it would work.
Right now I haven't had much luck finding a solution. I've found code that checks modified dates on local files, or files on a local server, but nothing that deals with FTP other than uploading/downloading files.
Here's the last thing I tried: a combination of what I found for local files with FTP. It didn't work too well.
I'm new to PowerShell, but I've been pretty good at piecing this whole thing together so far. However, this idea is becoming troublesome.
Thank you for the help.
$ftpsite = "ftp://ftpsite.com/folder/"
$firefox = (Get-Item $dir\Apps\install_firefox.exe).LastWriteTime.toString("MM/dd/yyyy")
if ($firefoxftp = (Get-ChildItem $ftpsite/install_firefox.exe | Where{$_.LastWriteTime -gt $firefox})) {
$File = "$dir\Apps\install_firefox.exe"
$ftp = "ftp://ftpsite.com/folder/install_firefox.exe"
$webclient = New-Object System.Net.WebClient
$uri = New-Object System.Uri($ftp)
$webclient.DownloadFile($uri, $File)
}
UPDATE:
Here's what I have after Martin's help. It kind of works: it downloads the file from FTP, but it isn't comparing the remote and local timestamps correctly. The remote file returns 20150709140505 and the local file returns 07/09/2015 2:05:05 PM. How do I format one to look like the other before the comparison, and is -gt the correct comparison to use?
Thanks!
function update {
$ftprequest = [System.Net.FtpWebRequest]::Create("ftp://ftpsite.com/Script_Apps/install_firefox.exe")
$ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::GetDateTimestamp
$response = $ftprequest.GetResponse().StatusDescription
$tokens = $response.Split(" ")
$code = $tokens[0]
$localfile = (Get-Item "$dir\Apps\install_firefox.exe").LastWriteTimeUtc
if ($tokens -gt $localfile) {
write-host "Updating Firefox Installer..."
$File = "$dir\Apps\install_firefox.exe"
$ftp = "ftp://ftpsite.com/Script_Apps/install_firefox.exe"
$webclient = New-Object System.Net.WebClient
$uri = New-Object System.Uri($ftp)
$webclient.DownloadFile($uri, $File)
"Updated Firefox" >> $global:logfile
mainmenu
}
else {
Write-Host "Local Copy is Newer."
sleep 3
mainmenu
}
}
UPDATE 2:
Seems to be working! Here's the code. Thanks for the help!
function update {
$ftprequest = [System.Net.FtpWebRequest]::Create("ftp://ftpserver.com/Script_Apps/install_firefox.exe")
$ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::GetDateTimestamp
$response = $ftprequest.GetResponse().StatusDescription
$tokens = $response.Split(" ")
$code = $tokens[0]
$localtime = (Get-Item "$dir\Apps\install_firefox.exe").LastWriteTimeUtc
if ($code -eq 213) {
$tokens = $tokens[1]
$localtime = "{0:yyyymmddHHmmss}" -f [datetime]$localtime
}
if ($tokens -gt $localtime) {
write-host "Updating Firefox Installer..."
$File = "$dir\Apps\install_firefox.exe"
$ftp = "ftp://ftpserver.com/Script_Apps/install_firefox.exe"
$webclient = New-Object System.Net.WebClient
$uri = New-Object System.Uri($ftp)
$webclient.DownloadFile($uri, $File)
"Updated Firefox" >> $global:logfile
mainmenu
}
else {
Write-Host "Local Copy is Newer."
sleep 3
mainmenu
}
}
You cannot use the WebClient class to check a remote file's timestamp.
You can use the FtpWebRequest class with its GetDateTimestamp FTP "method" and parse the UTC timestamp string it returns. The format is specified by RFC 3659 to be YYYYMMDDHHMMSS[.sss].
That will work only if the FTP server supports the MDTM command that the method uses under the hood (most servers do, but not all).
$url = "ftp://ftpsite.com/folder/install_firefox.exe"
$ftprequest = [System.Net.FtpWebRequest]::Create($url)
$ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::GetDateTimestamp
$response = $ftprequest.GetResponse().StatusDescription
$tokens = $response.Split(" ")
$code = $tokens[0]
if ($code -eq 213)
{
Write-Host "Timestamp is" $tokens[1]
}
else
{
Write-Host "Error" $response
}
It would output something like:
Timestamp is 20150709065036
Now you parse it, and compare against a UTC timestamp of a local file:
(Get-Item "install_firefox.exe").LastWriteTimeUtc
Or save yourself some time and use an FTP library/tool that can do this for you.
For example, with the WinSCP .NET assembly you can synchronize a whole remote folder of installers with a local copy using a single call to Session.SynchronizeDirectories. Or you can limit the synchronization to a single file only.
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"
# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions
$sessionOptions.Protocol = [WinSCP.Protocol]::Ftp
$sessionOptions.HostName = "ftpsite.com"
$session = New-Object WinSCP.Session
# Connect
$session.Open($sessionOptions)
$transferOptions = New-Object WinSCP.TransferOptions
# Synchronize only this one file.
# If you remove the file mask, all files in the folder are synchronized:
$transferOptions.FileMask = "install_firefox.exe"
$session.SynchronizeDirectories(
[WinSCP.SynchronizationMode]::Local, "$dir\Apps", "/folder",
$False, $False, [WinSCP.SynchronizationCriteria]::Time,
$transferOptions).Check()
To use the assembly, just extract the contents of the .NET assembly package to your script folder. No other installation is needed.
The assembly supports not only MDTM, but also alternative methods of retrieving the timestamp.
See also a related PowerShell example that shows both the above code and other techniques.
(I'm the author of WinSCP)
I am struggling to get control of an IE preview control of the 'Internet Explorer_Server' class in an external Windows application with Perl.
Internet Explorer_Server is the class name of the window; I found it with Spy++. Here's my assertion code for it:
$className = Win32::GUI::GetClassName($window);
if ($className eq "Internet Explorer_Server") {
...
}
I can get a handle to that 'Internet Explorer_Server' window with Win32::GUI::GetWindow, but I have no idea what to do next.
Updated: You are going down the wrong path. What you need is Win32::OLE.
#!/usr/bin/perl
use strict;
use warnings;
use Win32::OLE;
$Win32::OLE::Warn = 3;
my $shell = get_shell();
my $windows = $shell->Windows;
my $count = $windows->{Count};
for my $item ( 1 .. $count ) {
my $window = $windows->Item( $item );
my $doc = $window->{Document};
next unless $doc;
print $doc->{body}->innerHTML;
}
sub get_shell {
my $shell;
eval {
$shell = Win32::OLE->GetActiveObject('Shell.Application');
};
die "$#\n" if $#;
return $shell if defined $shell;
$shell = Win32::OLE->new('Shell.Application')
or die "Cannot get Shell.Application: ",
Win32::OLE->LastError, "\n";
}
__END__
So, this code finds a window with a Document property and prints the HTML. You will have to decide on what criteria you want to use to find the window you are interested in.
ShellWindows documentation.
You may want to have a look at Win32::IE::Mechanize. I am not sure whether you can control an existing IE window with this module, but accessing a single URL should be possible in about five lines of code.
Have you looked at Samie (http://samie.sourceforge.net/)? It is a Perl module for controlling IE.
I'm trying to port a Perl script from Unix to Windows, but am having a near-impossible time getting it to work due to the unsupported forking pipe in the open function. Here's the code:
sub p4_get_file_content {
my $filespec = shift;
return 'Content placeholder!' if ($options{'dry-run'});
debug("p4_get_file_content: $filespec\n");
local *P4_OUTPUT;
local $/ = undef;
my $pid = open(P4_OUTPUT, "-|");
die "Fork failed: $!" unless defined $pid;
if ($pid == 0) { # child
my $p4 = p4_init();
my $result = undef;
$result = $p4->Run('print', $filespec);
die $p4->Errors() if $p4->ErrorCount();
if (ref $result eq 'ARRAY') {
for (my $i = 1; $i < @$result; $i++) {
print $result->[$i];
}
}
$p4->Disconnect();
exit 0;
}
my $content = <P4_OUTPUT>;
close(P4_OUTPUT) or die "Close failed: ($?) $!";
return $content;
}
The error is:
'-' is not recognized as an internal or external command,
operable program or batch file.
Does anyone know how to make this work? Thanks!
Mike
I know it's not a direct answer to your question, but it looks like you're scripting something on top of Perforce in Perl? If so, you might find that an existing library already does what you want and save yourself a lot of headaches, or at least gives you some sample code to work from.
For example:
P4Perl
P4::Server
P4::C4
EDIT: Now that I know what you're doing, I'm guessing you're trying to port p42svn to Windows, or at least make it compatible with Windows. See this thread for a discussion of this exact issue. The recommendation (untested) is to try the code samples listed at http://perldoc.perl.org/perlfork.html under "Forking pipe open() not yet implemented" to explicitly create the pipe instead.
It's not going to work as-is; you'll need to find another way to accomplish what it's doing. It doesn't look like there's a burning need for the fork-pipe, but it's hard to tell, since I don't know what p4 is and a lot of your code is being lost to angle-bracket interpretation.