Problem getting the list of shared folders with SMBLibrary (SMB)

I'm trying to get the list of shared folders from an SMB server using the following code, but ListShares() returns "STATUS_USER_SESSION_DELETED":
var client = new SMB2Client();
var success = client.Connect(System.Net.IPAddress.Parse("192.21.1.40"),
    SMBTransportType.DirectTCPTransport);
// Success
if (success)
{
    var status = client.Login(String.Empty, "user", "pass");
    // Success
    if (status == NTStatus.STATUS_SUCCESS)
    {
        var shares = client.ListShares(out var actionStatus);
        // **FAILURE : SMBLibrary.NTStatus.STATUS_USER_SESSION_DELETED**
        foreach (var item in shares)
        {
            Console.WriteLine(item);
        }
    }
}
I'm using the following library for SMB communication:
https://github.com/TalAloni/SMBLibrary
I'm developing in C#/.NET.
Thanks in advance for the help.

The OP submitted a packet capture which demonstrated that the problem was an outdated Samba server on the other end. (https://github.com/TalAloni/SMBLibrary/issues/24)

Firefox extension proxy

I am trying to create a Firefox extension to block search terms on school computers. I'd like to prohibit a list of keywords, but the blocking doesn't seem to be working.
I found an example through a plugin gallery here:
https://github.com/mdn/webextensions-examples/blob/master/proxy-blocker/background/proxy-handler.js
This plugin listens for blocked hosts and basically returns localhost for them. I'd like to do the same, but only when blocked search terms appear in the request. I used the code in the example above as a starting point.
Here is the code I have so far:
// Initialize the list of blocked hosts
let blockedHosts = ["www.duckduckgo.com", "www.google.com"];
let blockedTerms = ["games", "minecraft", "legos"];

// Set the default list on installation.
browser.runtime.onInstalled.addListener(details => {
  browser.storage.local.set({
    blockedHosts: blockedHosts
  });
});

// Get the stored list
browser.storage.local.get(data => {
  if (data.blockedHosts) {
    blockedHosts = data.blockedHosts;
  }
});

// Listen for changes in the blocked list
browser.storage.onChanged.addListener(changeData => {
  blockedHosts = changeData.blockedHosts.newValue;
});

// Manage the proxy
// Listen for a request to open a webpage
browser.proxy.onRequest.addListener(handleProxyRequest, {urls: ["<all_urls>"]});

function handleProxyRequest(requestInfo) {
  let urlToCheck = new URL(requestInfo.documentUrl);
  let searchString = urlToCheck.search;
  const url = new URL(requestInfo.url);
  let found;
  blockedTerms.map((term) => {
    if (searchString.search(term) != -1) {
      found = true;
    }
  });
  if (blockedHosts.indexOf(url.hostname) != -1 & found) {
    return {type: "https", host: "127.0.0.1", port: 65535};
  }
  // Return instructions to open the requested webpage
  return {type: "direct"};
}

// Log any errors from the proxy script
browser.proxy.onError.addListener(error => {
  console.error(`Proxy error: ${error.message}`);
});
The URL that the browser creates is https://duckduckgo.com/?t=ffab&q=games&ia=web, for example. I can determine that the term "games" was found, and that it was found in a DuckDuckGo search, but the proxy won't kick in and the browser won't stop the user from going to the page.
Any help would be appreciated!
To start with, in a school environment, I suppose they have to use the school's net connection. It would be a lot easier to block at the main internet connection instead of creating and installing an add-on on each computer (which might be altered or bypassed with another browser).
However, to answer your question, the following would be one (simpler) way of doing that using webRequest.onBeforeRequest:
// add a listener for web requests
browser.webRequest.onBeforeRequest.addListener(process,
  {urls: ['*://*/*']},
  ['blocking']
);

function process(e) {
  // e.url is the target url
  // no need for storage as the filter-list is hard-coded
  const blockedHosts = ['www.duckduckgo.com', 'www.google.com'];
  const blockedTerms = ['games', 'minecraft', 'legos'];
  const hostRegExp = new RegExp(`^https?://(${blockedHosts.join('|')})/`, 'i');
  const termRegExp = new RegExp(`(${blockedTerms.join('|')})`, 'i');
  // if matches above criteria, redirect to 127.0.0.1
  if (hostRegExp.test(e.url) && termRegExp.test(e.url)) {
    return {redirectUrl: 'https://127.0.0.1:65535/'};
  }
}
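As a follow-up idea (not part of the original answer), the same check can be done without regular expressions by parsing the URL. This is only a sketch; it assumes the search term always lives in the q query parameter (which holds for Google and DuckDuckGo searches), and the isBlocked helper name is made up for illustration:

// Sketch of a hypothetical helper: true if the request targets a blocked host
// and its "q" query parameter contains a blocked term.
const blockedHosts = ['www.duckduckgo.com', 'duckduckgo.com', 'www.google.com'];
const blockedTerms = ['games', 'minecraft', 'legos'];

function isBlocked(rawUrl) {
  const url = new URL(rawUrl);
  if (!blockedHosts.includes(url.hostname)) {
    return false;
  }
  const query = (url.searchParams.get('q') || '').toLowerCase();
  return blockedTerms.some(term => query.includes(term));
}

// Example: isBlocked('https://duckduckgo.com/?t=ffab&q=games&ia=web') returns true.

Note that blocking webRequest listeners also require the webRequest and webRequestBlocking permissions plus matching host permissions in manifest.json, and the proxy.onRequest approach from the question needs the proxy permission.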

How to update a User with useMasterKey in Parse

Issue Description
I'm trying to update a User when another user clicks a button in my Xamarin app.
I used Cloud Code to perform this, but it doesn't work.
My Code
Here is my complete JS code:
Parse.Cloud.beforeSave("Archive", function(request, response) {
  Parse.serverURL = 'https://pg-app-0brffxkawi8lqvf2eyc2isqrs66zsu.scalabl.cloud/1/';
  var status = request.object.get("status");
  if (status == "validated") {
    var event = request.object.get("event");
    event.fetch({
      success: function(myEvent) {
        var coinsEvent = myEvent.get("coins");
        var user = request.object.get("user");
        user.fetch({
          success: function(myUser, coinsEvent, user) {
            var email = myUser.get("email");
            var coinsUser = myUser.get("coins");
            myUser.set("coins", coinsUser + coinsEvent);
            return myUser.save(null, {useMasterKey:true});
          }
        });
      }
    });
  }
  response.success();
});
I think myUser.save(null, {useMasterKey:true}); should work.
This is the error I actually get:
Dec 24, 2017, 12:27 GMT+1 - ERRORError generating response for [PUT] /1/classes/_User/1GPcqmn6Hd
"Cannot modify user 1GPcqmn6Hd."
{
  "coins": 250
}
Environment Setup
Server
parse-server version: v2.3.3
Server: Sashido
Your success branch never calls response.success(), which is a problem... though maybe not THE problem.
You are also doing two fetches inside a beforeSave function, which is not recommended. beforeSave must finish quickly, and fetches take time. I would consider thinking through other options.
If you really need to do it this way, consider doing a Parse.Query("event") with an include("user") and triggering the query with query.first({useMasterKey:true}).
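For illustration only, a rough sketch of that suggestion (the class name "Event" and the include path are assumptions, not taken from the question; adjust them to your actual schema):

// Sketch: one query with include() instead of the nested fetch() calls.
// "Event" and the "user" include path are assumed names.
var eventPointer = request.object.get("event");
var query = new Parse.Query("Event");
query.include("user");                        // resolve the related pointer in the same request
query.equalTo("objectId", eventPointer.id);   // target the event referenced by the Archive
query.first({useMasterKey: true}).then(function(myEvent) {
  // read myEvent.get("coins") here, update the user, and only then call response.success()
});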
Are you sure coinsEvent is what you think it is? fetch only returns the object fetched; I'm not sure you can curry in other parameters. I would change your final success routine to the following (double-checking that coinsEvent is valid):
success: function(myUser) {
  var coinsUser = myUser.get("coins");
  myUser.set("coins", coinsUser + coinsEvent);
  return myUser.save(null, {useMasterKey:true}).then(_ => response.success());
}

Parse.com background job fails: the service is currently unavailable

I am using Cloud Code to update all users every day. It used to work, but now I get the error "the service is currently unavailable" after about 5 minutes of processing, with no further detail. I have checked status.parse.com and there is no relevant outage. I have 10,000 users.
Parse.Cloud.job("makeUsersPassiveAndSendPushes", function(request, status) {
  Parse.Cloud.useMasterKey();
  var activeUsers = [];
  var limitDoneUsers = [];
  var nowDate = new Date();
  var updatedUsers = [];
  var query = new Parse.Query(Parse.User);
  query.equalTo("passive", false);
  query.each(function(user) {
    if (user.get("passive") === false) {
      activeUsers.push(user);
      user.set("passive", true);
      user.set("passiveDate", nowDate);
    }
    if (user.get("isLimitDone")) {
      limitDoneUsers.push(user);
    }
    user.set("isLimitDone", false);
    user.set("activeMatch", null);
    user.set("canGetMatch", true);
    user.set("dailyMatchEndCount", 0);
    //user.set("lastMatchLimit", false);
    user.set("todaysMatches", []);
    updatedUsers.push(user);
    return user.save();
  })
Could you help me? Thanks.
You may want to try modifying the last line from:
return user.save();
to use callbacks for the save function, to ensure they are firing in sequence:
return user.save(null, {
  success: function (user) {
    return user;
  },
  error: function (error) {
    return Parse.Promise.error(new Error("error"));
  }
});
Another alternative would be to use the saveAll function like this:
return Parse.Object.saveAll(updatedUsers).then(function() {
  // code that fires after all objects are saved
});
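For context, here is a rough sketch of how the whole job could be chained so the batch save finishes before the job reports its status (this assumes the legacy Parse.com Cloud Code job API, where the job callback receives a status object, as in the question):

// Sketch only: collect users inside query.each, save them in one batch afterwards,
// and report the job status only when everything has settled.
query.each(function(user) {
  // ...set the fields as in the original job...
  updatedUsers.push(user);
}).then(function() {
  return Parse.Object.saveAll(updatedUsers);
}).then(function() {
  status.success("Users updated.");
}, function(error) {
  status.error("Update failed: " + error.message);
});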
Also, are you using the hosted Parse.com environment or have you transitioned to another provider like Heroku & mLab?
As a fellow Parse user with this same issue (background job failing with this error when performing many inserts), I look forward to any comments you may have.

Offline data retrieval in a Cordova Windows project

I am having difficulties using the Kapsel OData plugin to retrieve data from a store when the device is offline.
Here is the situation:
Cordova application for Windows platform
When the app opens, I start by opening a store against my OData service (similar to the Northwind service)
The store is created and opened. I then try to retrieve data from the store using OData.read, or by setting a model and then calling read() on it.
The store opens successfully. However, my call to read the data succeeds when the device is online and fails when it is offline, no matter which of the two methods I use.
Here is my code:
function openStore() {
  var properties = {
    "name": "Emergency",
    "host": applicationContext.registrationContext.serverHost,
    "port": applicationContext.registrationContext.serverPort,
    "https": applicationContext.registrationContext.https,
    "serviceRoot": appId,
    "definingRequests": {
      "Products": "/Products"
    }
  };
  store = sap.OData.createOfflineStore(properties);
  store.open(openStoreSuccessCallback, errorCallback);
}

function openStoreSuccessCallback() {
  sap.OData.applyHttpClient();
  retrieveWithModel(); //retrieveWithOData();
}

function retrieveWithModel() {
  var uri = applicationContext.applicationEndpointURL;
  var user = applicationContext.registrationContext.user;
  var password = applicationContext.registrationContext.password;
  var headers = { "X-SMP-APPCID": applicationContext.applicationConnectionId };
  var oModel = new sap.ui.model.odata.ODataModel(uri, {
    json: "true",
    user: user,
    password: password,
    headers: headers
  });
  sap.ui.getCore().setModel(oModel);
  oModel.read("/Products", {
    success: function (oEvent) {
      var msg = new Windows.UI.Popups.MessageDialog("Success");
      msg.showAsync();
    },
    error: function (err) {
      console.log("you have failed");
      var msg = new Windows.UI.Popups.MessageDialog("Fail");
      msg.showAsync();
    }
  });
}

function retrieveWithOData() {
  var sURL = applicationContext.applicationEndpointURL + "/Products";
  var oHeaders = {};
  oHeaders['Authorization'] = authStr;
  oHeaders['X-SMP-APPCID'] = applicationContext.applicationConnectionId;
  //oHeaders['Content-Type'] = "application/json";
  //oHeaders['X-CSRF-Token'] = "FETCH";
  var request = {
    headers: oHeaders,
    requestUri: sURL,
    method: "GET"
  };
  OData.read(request,
    function (data, response) {
      console.log('Success');
    },
    function (err) {
      console.log('Fail');
    }
  );
}
Kapsel SDK version is 3.8.0
SMP SDK is SP08
Cordova version 5.3.3
I am wondering if this is an issue with the way the app is launched. I need a way to open the same instance of the application every time, so that the offline store is kept with all its data. Because Cordova-generated Visual Studio projects do not produce an .exe file (only .appx packages, which would need to be signed and sideloaded), here is how I proceed: I run the application in online mode from Visual Studio, pin it to the taskbar or Start menu, close it, switch the device to offline mode, and reopen it from the taskbar.
However, more and more it seems like this method does not work as expected.
Can anyone confirm that a Visual Studio project, opened from the taskbar, should run the same way as when it is run from VS, with the same dependencies, libraries, etc.? If that is the case (and I can't really imagine why it wouldn't be), does anyone have experience with these technologies and see what a potential issue could be?
Any help would be greatly appreciated.
Thanks!
OK, I found the solution to my problem. In case anyone ever encounters the same issue: the problem was that my offline store was not being used (you can see with Fiddler that there are outbound requests to the backend system even in offline mode).
The Visual Studio project does keep the store from one build or launch to the next.

A file upload progress bar with Node (socket.io and formidable) and Ajax

I was in the middle of teaching myself some Ajax, and this lesson required building a simple file upload form locally. I'm running XAMPP on windows 7, with a virtual host set up for http://test. The solution in the book was to use node and an almost unknown package called "multipart" which was supposed to parse the form data but was crapping out on me.
I looked for the best package for the job, and that seems to be formidable. It does the trick and my file will upload locally and I get all the details back through Ajax. BUT, it won't play nice with the simple JS code from the book which was to display the upload progress in a progress element. SO, I looked around and people suggested using socket.io to emit the progress info back to the client page.
I've managed to get formidable working locally, and I've managed to get socket.io working with some basic tutorials. Now, I can't for the life of me get them to work together. I can't even get a simple console log message to be sent back to my page from socket.io while formidable does its thing.
First, here is the file upload form by itself. The script inside the upload.html page:
document.getElementById("submit").onclick = handleButtonPress;
var httpRequest;

function handleResponse() {
  if (httpRequest.readyState == 4 && httpRequest.status == 200) {
    document.getElementById("results").innerHTML = httpRequest.responseText;
  }
}

function handleButtonPress(e) {
  e.preventDefault();
  var form = document.getElementById("myform");
  var formData = new FormData(form);
  httpRequest = new XMLHttpRequest();
  httpRequest.onreadystatechange = handleResponse;
  httpRequest.open("POST", form.action);
  httpRequest.send(formData);
}
And here's the corresponding Node script (the important part being form.on('progress')):
var http = require('http'),
    util = require('util'),
    formidable = require('formidable');

http.createServer(function(req, res) {
  if (req.url == '/upload' && req.method.toLowerCase() == 'post') {
    var form = new formidable.IncomingForm(),
        files = [],
        fields = [];
    form.uploadDir = './files/';
    form.keepExtensions = true;
    form
      .on('progress', function(bytesReceived, bytesExpected) {
        console.log('Progress so far: ' + (bytesReceived / bytesExpected * 100).toFixed(0) + "%");
      })
      .on('file', function(name, file) {
        files.push([name, file]);
      })
      .on('error', function(err) {
        console.log('ERROR!');
        res.end();
      })
      .on('end', function() {
        console.log('-> upload done');
        res.writeHead(200, "OK", {
          "Content-Type": "text/html", "Access-Control-Allow-Origin": "http://test"
        });
        res.end('received files: ' + util.inspect(files));
      });
    form.parse(req);
  } else {
    res.writeHead(404, {'content-type': 'text/plain'});
    res.end('404');
  }
  return;
}).listen(8080);

console.log('listening');
Ok, so that all works as expected. Now here's the simplest socket.io script which I'm hoping to infuse into the previous two to emit the progress info back to my page. Here's the client-side code:
var socket = io.connect('http://test:8080');
socket.on('news', function(data) {
  console.log('server sent news:', data);
});
And here's the server-side node script:
var http = require('http'),
    fs = require('fs');

var server = http.createServer(function(req, res) {
  fs.createReadStream('./socket.html').pipe(res);
});

var io = require('socket.io').listen(server);

io.sockets.on('connection', function(socket) {
  socket.emit('news', {hello: "world"});
});

server.listen(8080);
So this works fine by itself, but my problem comes when I try to place the socket.io code inside my form handling. I've tried placing it anywhere it might remotely make sense, and I've tried the asynchronous mode of fs.readFile too, but it just won't send anything back to the client, while the file upload portion still works fine. Do I need to establish some sort of handshake between the two packages? Help me out here. I'm a front-end guy, so I'm not too familiar with this back-end stuff. I'll put this aside for now and move on to other lessons.
Maybe you can create a room for one single client and then broadcast the percentage to this room.
I explained it here: How to connect formidable file upload to socket.io in Node.js
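For what it's worth, here is a rough sketch of that idea. It assumes the upload page already has the socket.io connection from above, that the client appends its socket id to the upload URL, and that the event and parameter names used here (uploadProgress, socketId) are made up for illustration:

// Client sketch: send the socket id with the upload and listen for progress events.
// (socket.id is available once the socket has connected.)
httpRequest.open("POST", form.action + "?socketId=" + socket.id);
socket.on('uploadProgress', function(percent) {
  document.querySelector('progress').value = percent;
});

// Server sketch: give each client a room named after its own socket id...
io.sockets.on('connection', function(socket) {
  socket.join(socket.id);
});

// ...then, inside the '/upload' branch, emit formidable's progress to that room.
var socketId = require('url').parse(req.url, true).query.socketId;
form.on('progress', function(bytesReceived, bytesExpected) {
  var percent = Math.round(bytesReceived / bytesExpected * 100);
  io.sockets.in(socketId).emit('uploadProgress', percent);
});

The key wiring point is that the socket.io instance and the upload handler have to share the same http server (or at least the io variable must be in scope inside the upload handler), otherwise the progress events have nowhere to go.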
