Magento 1.7 Local to Remote server transfer

After transferring my local Magento site to remote ..., I get errors such as:
"a:5:{i:0;s:45:"Unable to read response, or response is empty";i:1;s:1151:"#0 C:\xampp\htdocs\magento\magento-1.7.0.2\magento\lib\Varien\Http\Client.php(61): Zend_Http_Client->request('GET')"
The remote site is still holding my local server's path; how do I change it? Is there a default config setting?

To a large degree this will depend on exactly which files you uploaded to the remote site. The most likely suspects are var/resource_config.json and anything in var/cache/mage--*. I believe you should be able to delete both, but if you're the more cautious type you can always rename them instead.
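For example, a minimal sketch of that cleanup from a shell on the remote host (the Magento root path is hypothetical, and the .bak rename is just the cautious variant):

cd /path/to/magento                                        # hypothetical Magento root on the remote host
mv var/resource_config.json var/resource_config.json.bak   # rename rather than delete, to be safe
rm -rf var/cache/mage--*                                   # stale cache entries holding the old local paths

Magento should rebuild both on the next request, this time with the remote server's paths.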

Related

Cannot access files on FTP server from Azure Data Factory

I currently have access to a third party's FTP server which, upon login, automatically redirects me to a directory that does not contain the files I am trying to download.
ftp://ftp.fakehost.com -> ftp://ftp.fakehost.com/uselessDir
My files are in ftp://ftp.fakehost.com/usefulDir.
This FTP server does not support directory traversal, so I cannot get to usefulDir by simply modifying my URL. FileZilla works, since there I can execute the specific FTP commands needed to get to the directory I want.
Can a Data Factory FTP service or dataset be customized to work around this problem, given that Data Factory cannot reach usefulDir directly?
Please correct me if I haven't understood your question correctly. Have you tried creating the dataset by hand and putting usefulDir in the folderPath property directly, instead of using the Authoring UI to navigate to that folder (which is not possible based on your description)?
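For illustration, a sketch of what that hand-authored dataset might look like in Data Factory v1 JSON (the dataset and linked-service names, and the file name, are hypothetical):

{
  "name": "FtpInputDataset",
  "properties": {
    "type": "FileShare",
    "linkedServiceName": "FtpLinkedService",
    "typeProperties": {
      "folderPath": "usefulDir",
      "fileName": "data.csv"
    },
    "external": true,
    "availability": { "frequency": "Day", "interval": 1 }
  }
}

Because folderPath is taken as given, the copy activity should request usefulDir directly instead of walking there from the login directory.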

Not Able to Find SCCM MPcontrol log location in Site Server

I am not able to find mpcontrol.log on one of my site servers.
I searched in E:\SMS_CCM\Logs, but failed to find the log there. Also, what is the purpose of the SMS_CCM folder?
In which drive and folder can I find the log?
Check E:\SMS\Logs.
Note that the mpcontrol.log file will only exist on your management point, not your distribution points (not sure whether they're one and the same in your infrastructure; just going by the info provided).

What are the final steps for transporting a Joomla project from a server to a computer

I tried to transfer a Joomla project from an online server to my local WAMP server. I followed a tutorial on how to do it and everything went well: I downloaded all the Joomla files from the server's public_html folder, set up the database (downloaded the .sql file from phpMyAdmin and imported it into my local WAMP SQL server), and set up the configuration.php file as in the tutorial (changed $log_path and $tmp_path; everything else stayed the same).
Then it worked fine: localhost loaded the whole website and all links worked. It just threw strict-standards and notice errors, so I turned those off by changing the $error_reporting value in the config file to "E_ALL | E_STRICT". Now everything works except the user accounts. I can't log into joomla/administrator, I can't register on the website, and it seems the site doesn't store or read that data from the database. It clearly has a connection to the database, because there is no longer any error about that. What else should I do? Any help would be much appreciated, and if more information is needed, please just ask.
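For reference, a sketch of the configuration.php entries usually worth rechecking after such a move (these are standard Joomla JConfig settings, but the values below are placeholders, not a confirmed fix for the account issue):

<?php
class JConfig {
    public $host = 'localhost';       // local MySQL host under WAMP
    public $user = 'root';            // WAMP's default MySQL user
    public $password = '';            // WAMP's default is an empty password
    public $db = 'joomla_db';         // whatever name the imported dump was given
    public $dbprefix = 'jos_';        // must match the table prefix in the imported .sql file
    public $live_site = '';           // a leftover production URL here breaks logins and redirects
    public $cookie_domain = '';       // a leftover production domain here can block local logins
    public $log_path = 'C:\\wamp\\www\\joomla\\logs';
    public $tmp_path = 'C:\\wamp\\www\\joomla\\tmp';
}

Login trouble after a migration can come from the last three entries in particular, since session cookies and redirects may still point at the old host.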

Trouble Uploading Large Files to RStudio using Louis Aslett's AMI on EC2

After following this simple tutorial (http://www.louisaslett.com/RStudio_AMI/) and video guide (http://www.louisaslett.com/RStudio_AMI/video_guide.html), I have set up an RStudio environment on EC2.
The only problem is, I can't upload large files (> 1GB).
I can upload small files just fine.
When I try to upload a file via RStudio, it gives me the following error:
Unexpected empty response from server
Does anyone know how I can upload these large files for use in RStudio? This is the whole reason I am using EC2 in the first place (to work with big data).
OK, so I had the same problem myself, and it was incredibly frustrating, but eventually I realised what was going on here. The default home-directory volume on AWS is less than 8-10 GB regardless of the size of your instance, and since the upload was going to the home directory, there was not enough room. An experienced Linux user would not have fallen into this trap, but hopefully other Windows users new to this who come across the problem will see this. If you upload to a different volume on the instance, the problem is solved. As the Louis Aslett RStudio AMI lives in that 8-10 GB space, you have to set your working directory outside it, i.e. outside the home directory; this is not intuitively apparent from the RStudio Server interface. While this is an advanced forum and this is a rookie error, I hope no one deletes the question, as I spent months on this and I think someone else will too.
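A minimal sketch of what that looks like from an SSH session on the instance (the /mnt/bigdata mount point is hypothetical, and the rstudio user name is an assumption about the AMI's defaults):

df -h                                     # confirm which volume actually has the free space
sudo mkdir -p /mnt/bigdata                # create a working directory on the larger volume
sudo chown rstudio:rstudio /mnt/bigdata   # let the RStudio user write there

Then, in RStudio, run setwd("/mnt/bigdata") and upload or read your big files from there instead of the home directory.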
Don't you have shell access to your Amazon server? Don't rely on RStudio's upload (which may reasonably have a 2 GB limit) and use proper Unix dev tools:
rsync -avz myHugeFile.dat amazonusername@my.amazon.host.ip:
Run that on your local PC's command line (install Cygwin or another unixy compatibility system): it will transfer your huge file to the Amazon server, resume from the break point if interrupted, and compress the data for transfer too.
For a Windows GUI for this sort of thing, WinSCP is what we used in the bad old days before Linux.
This could have something to do with your web server. Are you using nginx or Apache? If you are running nginx on the front end, I would recommend the following fix in your nginx.conf file:
http {
    ...
    client_max_body_size 100M;
}
https://www.tecmint.com/limit-file-upload-size-in-nginx/
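Note that nginx only picks up the change after a reload; a quick sketch, assuming a systemd-based server:

sudo nginx -t                  # check the edited config for syntax errors
sudo systemctl reload nginx    # apply the change without dropping connections

client_max_body_size defaults to 1m, which would match the symptom of small uploads succeeding while large ones fail.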
I had a similar problem with a 5 GB file. What worked for me was to use SQLite to create a database from the CSV file I needed, and then use a function in RStudio to talk to the local database; that way I was able to bring the CSV file in. I can track down the R code I used if you like.
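For instance, a sketch of the import step using the sqlite3 command-line shell (the database, file, and table names are hypothetical):

sqlite3 mydata.db <<'EOF'
.mode csv
.import myHugeFile.csv mytable
EOF

From R you could then query mytable through DBI/RSQLite and pull in only the rows you need, rather than loading the whole CSV into memory.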

Nexus OSS - Clean up the proxy/attributes/

I have a Nexus OSS instance with the following settings:
it is a proxy of http://repo1.maven.org/maven2/
I overrode the "local storage location" with a path to a network device
Everything is OK and my Nexus instance works fine... but I noticed that the number of inodes grows a lot.
After a little checking, I can tell that these inodes all come from the proxy/attributes/ directory.
According to the documentation:
Stores data about the files contained in a remote repository. Each
proxy repository has a subdirectory in the proxy/attributes/ directory
and every file that Nexus has interacted with in the remote repository
has an XML file which captures such data as the: last requested
timestamp, the remote URL for a particular file, the length of the
file, and the digests for a particular file among other things. If you
need to backup the local cached contents of a proxy repository, you
should also back up the contents of the proxy repository's directory
under proxy/attributes/.
OK, I understand why there are a lot of little files in this location, but I have a dumb question: to avoid reaching my inode limit, could I periodically clean up the contents of proxy/attributes/ without breaking anything, and will these files be recreated on demand if needed?
I have found nothing about it...
Any clue will be greatly appreciated!
You can find details on the contents of the working folder here: https://docs.sonatype.com/display/SPRTNXOSS/Nexus+Workspace+Directories+Analysis
The part you're specifically interested in is this:
/Proxy: This folder contains an "attributes" subfolder, that holds as
many subfolders as many repos you have (repoId is the name of
these). This is the place where "item attributes" are persisted as lots
of very small files. These files contain information about expiration
status and are consulted during proxying. Therefore, they have impact
on proxy and group lookup speed if stored on a slow disk. These files
are recreated on demand if they are missing or corrupted and thus
don't need to be backed up.
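So periodic cleanup is safe in principle. As a sketch, something like this could trim attribute files untouched for a month (the storage path and the 30-day threshold are assumptions; point it at your overridden "local storage location"):

find /path/to/nexus-storage/proxy/attributes -type f -mtime +30 -delete

The trade-off is some extra latency on the first proxy request for each affected item, while Nexus rebuilds the attribute file.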
Hope that helps. If you need more realtime assistance, feel free to hop onto the user list or IRC: http://nexus.sonatype.org/project-information.html
