Slack exports its data as a ZIP, without images.
It uses public URLs instead, so in my imported channels inside Rocket.Chat I see only links to Slack instead of the inline images.
Is there any way to instruct Rocket.Chat to automatically download all those images and embed them in the channels?
What would be the best way to do that?
Currently, Rocket.Chat cannot automatically download and import the Slack attachments. There are several third-party and open-source tools to separately download the attachments referenced in the Slack export files. One example is slack-advanced-exporter.
Then you have all the attachments in a separate folder. This is still not enough to import the Slack messages with the attachments into Rocket.Chat. An additional converter would be needed to replace the URLs in the Slack JSONs with the correct file links and create the correct folder structure for the attachments. As far as I know, this converter is still missing (at least in the open-source landscape).
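To illustrate what such a converter would have to do, here is a minimal Go sketch: it walks an unzipped Slack export, parses each day's message JSON, and rewrites every attachment's url_private field (the field name Slack uses in its exports) to a local path. The assets/<id>-<name> layout is my own assumption; a real converter would have to match whatever layout the Rocket.Chat importer expects.

```go
// rewrite_urls.go: walks a Slack export tree and rewrites attachment
// URLs to local paths. "files"/"url_private" follow Slack's export
// format; the "assets/<id>-<name>" target layout is an assumption.
package main

import (
	"encoding/json"
	"fmt"
	"os"
	"path/filepath"
	"strings"
)

func main() {
	root := os.Args[1] // path to the unzipped Slack export

	err := filepath.Walk(root, func(path string, info os.FileInfo, err error) error {
		if err != nil || info.IsDir() || !strings.HasSuffix(path, ".json") {
			return err
		}
		raw, err := os.ReadFile(path)
		if err != nil {
			return err
		}
		var messages []map[string]interface{}
		if err := json.Unmarshal(raw, &messages); err != nil {
			return nil // skip files that aren't message arrays
		}
		changed := false
		for _, msg := range messages {
			files, ok := msg["files"].([]interface{})
			if !ok {
				continue // message has no attachments
			}
			for _, f := range files {
				file, ok := f.(map[string]interface{})
				if !ok {
					continue
				}
				id, _ := file["id"].(string)
				name, _ := file["name"].(string)
				// Point the message at the locally downloaded copy.
				file["url_private"] = fmt.Sprintf("assets/%s-%s", id, name)
				changed = true
			}
		}
		if !changed {
			return nil
		}
		out, err := json.MarshalIndent(messages, "", "  ")
		if err != nil {
			return err
		}
		return os.WriteFile(path, out, info.Mode())
	})
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
}
```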
I'm building an app where users can share download links to files. These files are served using Go's http.ServeContent, so they are sent as-is, without any HTML. However, when these files are shared on social media platforms or a messaging service, I want to be able to display an image à la Open Graph.
Is it possible to have Open Graph metadata tags show up for these non-HTML pages?
If it's not, is there any way to embed this content in an HTML5 page while still triggering a download of the file (and not the HTML page) when fetched with something like curl?
Follow-up question: if none of these is possible, is there anything else I could use to have an image and a title show up when my link is shared?
I suggest not linking to the file directly, but creating an actual download page for it, so that what gets shared is the download page rather than the file itself.
On the download page you could then implement the appropriate share buttons and initiate the download through a bit of JavaScript.
Alternatively, you could inspect whether a bot (like Facebook's, Telegram's, or Skype's link-preview crawler) is visiting the file's location, and then serve the appropriate Open Graph or Twitter Card headers; a sketch of this follows below.
Example of a user agent parser: https://github.com/mssola/user_agent
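To make the second option concrete, here is a minimal Go sketch (Go being the question's stack): the handler serves a tiny HTML document with Open Graph tags to known link-preview crawlers and the raw file to everyone else. The bot substrings, paths, and metadata values are illustrative; a parser like the one linked above can replace the naive substring check.

```go
package main

import (
	"fmt"
	"net/http"
	"strings"
)

// Well-known user-agent markers of link-preview crawlers (examples only).
var botMarkers = []string{
	"facebookexternalhit", "Twitterbot", "TelegramBot",
	"Slackbot", "SkypeUriPreview",
}

func isPreviewBot(r *http.Request) bool {
	ua := r.UserAgent()
	for _, marker := range botMarkers {
		if strings.Contains(ua, marker) {
			return true
		}
	}
	return false
}

func fileHandler(w http.ResponseWriter, r *http.Request) {
	if isPreviewBot(r) {
		// Crawlers get a tiny HTML document with the metadata;
		// the og:title and og:image values here are placeholders.
		w.Header().Set("Content-Type", "text/html; charset=utf-8")
		fmt.Fprint(w, `<!doctype html><html><head>
<meta property="og:title" content="My file">
<meta property="og:image" content="https://example.com/preview.png">
</head><body></body></html>`)
		return
	}
	// Regular clients (browsers, curl) still get the file itself;
	// the same branch works if you serve with http.ServeContent.
	http.ServeFile(w, r, "files/"+r.URL.Path[len("/dl/"):])
}

func main() {
	http.HandleFunc("/dl/", fileHandler)
	http.ListenAndServe(":8080", nil)
}
```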
I want to generate the whole set of Action JSONs in code, and then upload them, instead of working through the console.
There is an option to download the whole package as a zipped JSON, and theoretically you can also upload one. That might supply a hint on how to create the JSONs.
However, these files contain all kinds of IDs for the different building blocks, such as Intents or Entities. So when I export from Dialogflow, of course I get IDs for these.
But if I want to create a new Google Action, do I generate these IDs myself?
Is there documentation on what the structure of these JSONs should be?
The format used by the export/import process is not documented, and while you can try to work with it, there is no guarantee that it will continue to work in the future.
Depending on your needs, it may be better to work with the Dialogflow API (formerly the API.ai API). This provides an API to build and modify Intents and Entities (and do some other things). It isn't clear, however, whether it provides access to the settings for the various integrations.
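As a rough illustration, here is a minimal Go sketch of creating an intent through the v1 REST API, using the developer access token for auth as the v1 docs of the time described. The endpoint, the intent fields, and all values below are illustrative and worth checking against the current docs.

```go
package main

import (
	"bytes"
	"fmt"
	"net/http"
	"os"
)

func main() {
	// The agent's developer access token (placeholder env var name).
	token := os.Getenv("DIALOGFLOW_DEV_TOKEN")

	// A minimal intent body; the remaining fields can mirror
	// what an exported intent contains.
	intent := []byte(`{
		"name": "my-generated-intent",
		"userSays": [{"data": [{"text": "hello there"}]}]
	}`)

	req, err := http.NewRequest("POST",
		"https://api.dialogflow.com/v1/intents", bytes.NewReader(intent))
	if err != nil {
		panic(err)
	}
	req.Header.Set("Authorization", "Bearer "+token)
	req.Header.Set("Content-Type", "application/json")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	// The response carries the server-generated ID for the new
	// intent, so you don't mint these IDs yourself via the API.
	fmt.Println("status:", resp.Status)
}
```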
I've tried some of the services out there, including droplet, ctrlq.org/save, and some other sites that support fetching a file directly from a URL and uploading it to Dropbox, Google Drive, and the like, without the user having to store the file on a local disk.
Now the problem is that none of these services support multiple URLs or batch uploading, but I have quite a few URLs and I really need a service where I can paste them in, separated by newlines or semicolons, and have the files uploaded to Dropbox (or any other cloud storage).
Any help would be gladly appreciated.
The Dropbox Saver JavaScript control allows you to save up to 100 files to the user's Dropbox in one shot. You'll need to programmatically create the button using Dropbox.createSaveButton as explained in the linked page.
It seems like the 100-file limit (at any one time) is universal, but you might find that it isn't the case when using the Dropbox REST API. It looks possible to do this with Node.js server-side (OAuth and POSTs) or JavaScript client-side (automating FileReader). I'll review and try to add content so these aren't just links.
If you can leave a page open for about 20 minutes (a "technical limitation" of this approach), the Dropbox should be loadable 100 files at a time that way, assuming each upload takes less than 2 seconds; it's also an easy hook for adding a progress indicator.
If you're preloading the Dropbox once yourself, or the initial load is compatible with manual action, perhaps mapping a drive and unzipping an archive of your links to it would work. If your list of links isn't extremely volatile, the REST API could be used to synchronize changes.
Edit: I forgot to include this page on CloudConvert, which unzips archives containing up to 100 files into Dropbox. Your use case doesn't seem to include retrieving the actual content at your servers (generating zip files), sending the automation list to the browser, and then having the browser extract to Dropbox, but it's another option.
The Dropbox API now offers the ability to save a file into Dropbox directly via a URL. There's a blog post about it here:
https://blogs.dropbox.com/developers/2015/06/programmatically-saving-a-url-to-dropbox/
The documentation can be found here:
https://www.dropbox.com/developers/core/docs#save-url
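Since the question is specifically about batches, here is a minimal Go sketch that loops over a list of URLs and calls the save_url endpoint from the linked docs for each one. The access token, folder name, and file-naming scheme are placeholders.

```go
package main

import (
	"fmt"
	"net/http"
	"net/url"
	"os"
	"path"
	"strings"
)

func main() {
	token := os.Getenv("DROPBOX_TOKEN") // an OAuth 2 access token

	// The source URLs; in practice these could be read from stdin,
	// split on newlines or semicolons as the question asks.
	links := []string{
		"https://example.com/a.pdf",
		"https://example.com/b.pdf",
	}

	for _, link := range links {
		// Name the Dropbox copy after the last URL segment.
		endpoint := "https://api.dropbox.com/1/save_url/auto/saved/" + path.Base(link)

		form := url.Values{"url": {link}}
		req, err := http.NewRequest("POST", endpoint, strings.NewReader(form.Encode()))
		if err != nil {
			panic(err)
		}
		req.Header.Set("Authorization", "Bearer "+token)
		req.Header.Set("Content-Type", "application/x-www-form-urlencoded")

		resp, err := http.DefaultClient.Do(req)
		if err != nil {
			panic(err)
		}
		// The response body carries a job ID that can be polled
		// for completion, since save_url runs asynchronously.
		fmt.Println(link, "->", resp.Status)
		resp.Body.Close()
	}
}
```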
I'm in the process of developing a Google Apps migration/archive system, and at this point in development I'm trying to come up with a way to download all messages in all the groups that my domain users have created. I know that I can set up forwarding filters and have all messages archived to an email, but this doesn't help with older messages.
Is there a way to download these messages from a Google group, and if so, is there a way in the Admin API to get a list of all groups that users have created?
If you don't mind using Bash, you may try a tool I wrote:
https://github.com/icy/google-group-crawler
It can download all mbox files from a Google Group. If you have a cookie file, you can even download all files from a private Google Group and/or see all original emails. It can also read RSS feeds and fetch the latest posts, which is useful for a daily mirror.
An example result is here: http://l.archlinuxvn.org/archlinuxvn/. MHonArc is used to convert the mbox files into HTML format.
Ultimately I ended up using the gdata Python library to get a list of all groups along with their respective URLs. From there I used Selenium to scrape the groups for messages and all replies. Probably not the best solution, but it works for what I need.
I made a simple scraping utility using Selenium and HtmlUnit.
You can use it. It is not very optimized and can only help you download messages from small groups (up to 7000 messages).
https://github.com/himukr/google-grp-scraper
I would like to use the GSA API to read and write the Dynamic Navigation settings. From the API documentation, it looks like this isn't possible. Screen scraping is another option, but most of the page content is built client-side with JavaScript, which makes normal command-line screen scraping very difficult.
Is there an undocumented feature of the API perhaps, or some other way to access settings that aren't covered in the API?
The unofficial gsa-admin-toolkit project has a gsa_admin.py script which can import and export configuration files and, most importantly, re-sign them. Previously, editing an exported config file wasn't an option because the files contain a checksum or hash and become invalid when edited.
This tool allows you to edit the config file, re-sign it, and import it back into the GSA.