External Methods for Plone/Zope

I have two instances of Plone running on a server - their locations are /usr/local/Plone/Inst1 and /usr/local/Plone/Inst2. I'm trying to set up external methods, but am having a difficult time figuring out where my "Extensions" folder should be placed. It seems that wherever I place it, the ZMI never sees it. I have tried:
/usr/local/Plone/Extensions
/usr/local/Plone/Inst1/Extensions
...and various other sub-folders within the "Inst1" directory. When I add an external method, I've tried adding it both at the root of the ZMI ("/") and in the instance folder ("/Inst1/"). When I add it at the root, I get a message that says "The specified module, demo, could not be found." When I try to add it in the instance folder, I get an error page that says "This page does not seem to exist..."
It appears that the ZMI is failing to find that file ("demo.py"). Is there a particular place it should be stored?
According to the documentation on Zope's site (and numerous tutorials), it should go in the "Zope" folder - but I don't have any such folder (nor does any folder named "Extensions" exist on the file system, other than the ones I created in the "Plone" and "Plone/Inst1" directories).

The exact path depends on which version of Plone you have installed, but it is likely something like:
/path/to/plone/install/parts/instance/Extensions
Or
/path/to/plone/install/parts/client1/Extensions
You're probably better off putting the external method in a product, though, since placing it in the instance's "parts" folder means it will be wiped out every time you run buildout. And if you're going to make a product out of it anyway, you might as well write a traversable browser view, like "@@plone_context_state" and "@@plone_portal_state", which is usually a better way to do it.
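For reference, a minimal external method sketch, assuming a buildout layout where the instance lives under parts/instance (so the file goes in parts/instance/Extensions/demo.py); "demo" is whatever you enter as the "Function Name" when adding the External Method in the ZMI:

# parts/instance/Extensions/demo.py
# Minimal external method sketch. In the ZMI, add an External Method
# with Module Name "demo" and Function Name "demo".
def demo(self):
    """Say hello from the object the method is called on."""
    # When the first argument is named "self", Zope binds it to the
    # context object the method is invoked on.
    return "Hello from %s" % self.getId()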

Related

Is there a way to have multiple files with same backing data in macOS FileProvider extension?

I'm creating a macOS FileProvider extension for a remote document storage system (kind of like Google Drive), where it is possible to share a single document with multiple folders.
For example, Document1.pdf can simultaneously exist in Folder A and Folder B because it's shared with both folders. In my FileProvider extension, this means the file should be accessible in both folders:
Folder A/Document1.pdf
Folder B/Document1.pdf
But the file provider extension treats those as two completely separate files. That is, if you download one of them and then try to open the other, it will redownload it, effectively doubling the space used on the user's disk and consuming network bandwidth.
I'm looking for a way to tell the FileProviderItem what the backing data for a given file is, and thus solve problems such as:
If the user downloads a file in one location, I would ideally tell the FileProvider extension that the same document in all the other locations is now also downloaded (the cloud icon should disappear from all of them).
Some approaches I considered:
1. I thought of using symbolic links as part of a solution, but I don't think that's actually possible.
2. When the user tries to open a non-downloaded file, the fetchContents(for itemIdentifier) callback is invoked. Once the file is downloaded, I would ideally notify all the other locations of the same document that they are now downloaded, e.g. by updating the isDownloaded property of NSFileProviderItem, but that doesn't seem to work (see the sketch after this list). And even if it did, I still couldn't tell the file what its backing data file should be.
3. By turning off the Sandbox capability, I guess I could, when the user tries to download/open a file that has already been downloaded in another location, immediately report the file as downloaded and provide a copy of the already-downloaded file as its data. But there are two drawbacks here:
3.1. I would have to turn off the Sandbox capability, because I want to access files under the FileProvider path directly.
3.2. The system would still use disk space for each copy. So if I have the same document in multiple folders, the extension keeps all those copies, with no way to tell it that all those files share the same backing data file somewhere in the extension's container.
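For what it's worth, here is a sketch of the signalling half of approach 2, assuming an NSFileProviderReplicatedExtension. DocumentStore and its methods are hypothetical helpers mapping one shared document ID to its download state and to every folder containing it; they are not FileProvider API:

import FileProvider

// Hypothetical helper, not FileProvider API.
final class DocumentStore {
    static let shared = DocumentStore()
    private var downloaded = Set<String>()
    private var folders: [String: [NSFileProviderItemIdentifier]] = [:]
    func markDownloaded(_ id: String) { downloaded.insert(id) }
    func hasLocalContents(for id: String) -> Bool { downloaded.contains(id) }
    func locations(of id: String) -> [NSFileProviderItemIdentifier] { folders[id] ?? [] }
}

// After fetchContents delivers the contents for one location, record
// that the shared document is downloaded and signal the enumerator of
// every folder containing it, so those items are re-enumerated and can
// report isDownloaded == true from the shared registry.
func contentsFetched(for documentID: String, in domain: NSFileProviderDomain) {
    DocumentStore.shared.markDownloaded(documentID)
    guard let manager = NSFileProviderManager(for: domain) else { return }
    for folderID in DocumentStore.shared.locations(of: documentID) {
        manager.signalEnumerator(for: folderID) { error in
            if let error = error {
                NSLog("signalEnumerator failed: \(error)")
            }
        }
    }
}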

Umbraco Imagegen.ashx and keeping images in another folder

I have an Umbraco installation that uses imagegen.ashx.
I would like to deploy it using Octopus Deploy; this is pretty simple and is already working.
My problem is that Octopus deploys to a new folder (with a version number) each time, which poses a problem for the media folder, since its contents can change in every deployed folder.
I have made a shared folder at the same level as the versions, and I have made a virtual directory in IIS.
If I access the files directly through a browser - they exist and everything is fine.
But if I use imagegen.ashx, it does not work. I have tried setting the ImageBaseDir property like so:
<Class Name="default" OverridesQueryString="true">
<AllowUpsizing>false</AllowUpsizing>
<MaxHeight>800</MaxHeight>
<MaxWidth>800</MaxWidth>
<ImageBaseDir>D:\Octopus\Applications\customer Test\customer\Shared\ </ImageBaseDir>
</Class>
What am I doing wrong?
ImageBaseDir is unlikely to expect a server-mapped physical path.
The imagegen documentation shows examples that are either relative to the site or use a fully qualified URL from another site.
Mapping to a path outside your webroot will likely cause permissions issues for a number of reasons I won't get into here... though I see that you have set up the Media folder as a virtual directory. Good start; however, it appears that you're mapping to the D:\ drive instead of the virtual directory you set up (presumably this is /Media?).
You may also want to look into how the imagegen "Cached" folders are handled by your Octopus setup. If the imagegen-generated images/files get munged, that's a no-good situation that could leave you with the appearance of missing images:
/(media-virtual-directory)/99999/Cached/index.xml could state that the generated image is "df1rt0lr.png" - but if that file got removed in the deploy process, you'll see the missing-image behaviour.
Have you used the AltImage property to specify a fallback to a known, always-available image? That will help you tell whether imagegen is throwing errors before outputting your expected result.
I don't believe it's an imagegen issue, however; it's a pretty mature, well-put-together, and well-documented product. I would look at using a relative path for ImageBaseDir - or better yet, not using the attribute at all and instead keeping your /Media folder in a constant location as your virtual directory.
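For comparison, here is the same configuration with a site-relative ImageBaseDir, assuming the virtual directory is mounted at /Media (the path is an assumption; adjust it to match your IIS setup):

<Class Name="default" OverridesQueryString="true">
  <AllowUpsizing>false</AllowUpsizing>
  <MaxHeight>800</MaxHeight>
  <MaxWidth>800</MaxWidth>
  <!-- Site-relative path into the virtual directory, instead of a physical D:\ path -->
  <ImageBaseDir>/Media/</ImageBaseDir>
</Class>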

How to make WiX create files in the Program Files folder during installation? I get "Access denied"

I am creating a WiX installer project. During one managed custom action, I need to create a file (other than the deployed files specified in the WiX components) in the installation folder, which by default is under the Program Files folder. I am getting the "Access denied" problem on Windows 7. After some searching, I found that people say it is not advisable to create files in the Program Files folder, and to create them instead in, for example, the AppData folder. For example, see this link:
C# Access denied to path in a Windows Application
But here is my question: the generated file is crucial to our software, so it must reside in the installation folder. Isn't that the point of installing software - in most cases, to create files in the Program Files folder? Does it mean the only files that should be added to the installation folder during installation are the deployed files (basically the targets of an XCopy)?
My file can't be made deployable in WiX, i.e., it can't be made ready before the installation. So what's the proper way, or best practice, to handle this situation: a file must be generated during the installation, into the installation folder. It is not some log file that I can put somewhere else. I tried creating a Permission element in WiX for INSTALLDIR, although that seems to be against the rule mentioned in the link, and it still failed. Thanks!
UPDATE:
Based on MichaelUrman's comment, some more information: the generated file is needed after the software is installed and is necessary during normal launches of the software. I think it also needs to be modified during normal use after the installation. And as I mentioned in a comment on @caveman_dick's answer, my CA is actually in the commit phase; I don't know whether there is any difference between that and a normal deferred CA.
Set the custom action to Execute="deferred"; that will run it elevated and should give it the required permissions to create the file.
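For illustration, a minimal sketch of what that can look like in the .wxs file; the Id, BinaryKey, and DllEntry values here are placeholders for your managed custom action:

<CustomAction Id="CreateGeneratedFile"
              BinaryKey="MyManagedCA"
              DllEntry="CreateFile"
              Execute="deferred"
              Impersonate="no"
              Return="check" />
<InstallExecuteSequence>
  <!-- Deferred actions must be scheduled between InstallInitialize and InstallFinalize -->
  <Custom Action="CreateGeneratedFile" Before="InstallFinalize">NOT Installed</Custom>
</InstallExecuteSequence>

With Impersonate="no", the deferred action runs in the elevated installer context of a per-machine install, which is what allows it to write under Program Files.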
Since you need to update that file from the main application, and I'm assuming your application does not require elevated privileges, you have three options.
The first is the worst: without a manifest, your executable's attempts to write to the Program Files folder will typically be redirected to the Virtual Store (see File Virtualization). It sounds like this isn't happening in your case, so you can't use it.
The second option is to modify the application to store this in an appropriate location such as the ProgramData folder, or Common Documents, or (if appropriate) a per-user location under LocalAppData. This is typically the best approach, but has the highest development costs.
Finally, the third option is to create the file and change its permissions (or, in some cases, to change the permissions on the folder containing it), allowing limited users to modify the file. See LockPermissions or MsiLockPermissionsEx for the Windows Installer way to approach this. If you go this route, change the permissions on as few files or folders as possible, as restricted as possible, to keep the system as safe as it can be.
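If you do go with the third option, here is a sketch of the WiX markup, assuming the generated file lives directly in INSTALLDIR (the component Id and the exact rights granted are placeholders to adapt):

<DirectoryRef Id="INSTALLDIR">
  <Component Id="InstallDirAcl" Guid="PUT-GUID-HERE">
    <!-- Grant limited users read/write on the install folder so the
         generated file can be created and modified; scope this to a
         dedicated subfolder if you can, to keep the ACL change small. -->
    <CreateFolder>
      <Permission User="Users" GenericRead="yes" GenericWrite="yes" />
    </CreateFolder>
  </Component>
</DirectoryRef>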

IFileOperation::MoveItems not working on namespace extension root

So I have this rooted (on a specific file type) namespace extension that supports drag-and-dropping files into it. I use IFileOperation to handle file operations.
Moving/copying a file that comes from outside the NSE into a sub-folder works. Copying a file into the namespace root works. However, I can't move a file into the root. It just does. Not. Work. My extension is never queried for an ITransferDestination, although it is for other use cases.
Have you ever been in this situation?
Notes:
I'm building the extension on top of Bjarke Viksoe's great TarFolder codebase.
The error I'm getting (through the standard Explorer dialog) is "The file is already in use"
the copy engine seems to end up deciding that I'm trying to move a file from the regular file system to the regular file system. The above error is the one returned by a call to MoveFileEx, from what I could gather by tracing through it.
the PIDLs look correct, and IFileOperation::CopyItems works for the same inputs anyway.
I'd like to stick with IFileOperation, as it provides the most natural integration
Try contacting Bjarke directly; he may have some insight as to what may be going on.
His email:
bjarke@viksoe.dk
I personally haven't worked with his frameworks before; sorry I can't be of any more help than this.

Is there a way in Visual Studio 2010 to have code/project/folder structure mapped from TFS to multiple machines?

I am currently working on several applications, and in some of these apps the solutions contain projects from multiple workspaces in Visual Studio 2010.
This causes an issue when others attempt to work on the code for a certain application, or simply download the code and run the app. I have my workspaces defined for my computer, but others do not.
What I want to do is have a way to set up a workspace, or a sort of workspace template, so that anyone can download the code from the server on any machine, get the required folder structure, and have the application run.
For instance, if I had the following server structure:
$/
$/SolutionFiles/
$/SubFolder1/
$/SubFolder1/ProjectA/
$/SubFolder1/ProjectB/
$/SubFolder2/
$/SubFolder2/Project1/
$/SubFolder2/Project2/
...and I had a solution $/SolutionFiles/MyAppSolution that contains code from $/SubFolder1/ProjectA/ and $/SubFolder2/Project1/, I want to have a separate workspace, possibly named something like "MyAppSolution_Workspace", that maps the solution folder and the related project folders to a generic path. This would need to work and be accessible from all other computers, and would need to keep the same directory structure as the server, with the same folder names and everything else expected by the solution/project files.
From what I can tell, shared workspaces in VS2010 would work, but they seem to apply to only one machine and are not entirely generic.
Are there any suggestions for how to accomplish this?
You probably won't like this answer, but here is what we do:
Keep your workspace mapping as simple as possible. Never move or rename folders or files using the workspace mapping. Then all users need to do is map from the top, and everything works. They may get more files than they want, but it will work.
Advanced users can use folder cloaking to avoid getting folders they aren't interested in, or can map just the folders they want. The key is that when they map specific folders instead of the root, they leave the paths the same as they would be if the root were mapped.
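As an illustration of that "map from the top" approach with the tf.exe command line (the collection URL, workspace name, and local path below are placeholders):

tf workspace /new MyAppSolution_Workspace /collection:http://tfsserver:8080/tfs/DefaultCollection
tf workfold /map $/ C:\src /workspace:MyAppSolution_Workspace

rem Advanced users can cloak folders they don't need:
tf workfold /cloak $/SubFolder2/Project2 /workspace:MyAppSolution_Workspace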
In our system, we then have our version and release branching structure above the folders in your example, so everything you've listed is duplicated inside a MAIN folder and inside a Release_1 folder.