I'm trying to write a Visual Studio Package that will attach the debugger to a named process.
I am using the following code in my package.
var info = new VsDebugTargetInfo
{
    dlo = DEBUG_LAUNCH_OPERATION.DLO_AlreadyRunning,
    bstrExe = strProcessName,
    bstrCurDir = "c:\\",
    bstrArg = "",
    bstrEnv = "",
    bstrOptions = null,
    bstrPortName = null,
    bstrMdmRegisteredName = null,
    bstrRemoteMachine = "",
    cbSize = (uint)System.Runtime.InteropServices.Marshal.SizeOf<VsDebugTargetInfo>(),
    grfLaunch = (uint)(__VSDBGLAUNCHFLAGS.DBGLAUNCH_DetachOnStop | __VSDBGLAUNCHFLAGS.DBGLAUNCH_StopDebuggingOnEnd | __VSDBGLAUNCHFLAGS.DBGLAUNCH_WaitForAttachComplete),
    fSendStdoutToOutputWindow = 1,
    clsidCustom = VSConstants.CLSID_ComPlusOnlyDebugEngine
};
VsShellUtilities.LaunchDebugger(ServiceProvider, info);
However, I get the following unhelpful error:
Exception : Unable to attach. Operation not supported. Unknown error: 0x80070057.
The code is obviously doing something, because if the process has not started I get this error instead:
Exception : Unable to attach. Process 'xxxxxxxx' is not running on 'xxxxxxxx'.
The process is a managed .NET 4 process, and I am able to attach to it through the VS UI.
For context, I am trying to replace a simple macro I was using in VS 2010 to do the same job, but macros obviously can't be run in newer versions of Visual Studio.
I found that a totally different piece of code, inspired by https://github.com/whut/AttachTo, worked much better to achieve the same result:
var dte = (DTE)GetService(typeof(DTE));
foreach (EnvDTE.Process process in dte.Debugger.LocalProcesses)
{
    if (process.Name.EndsWith(strProcessName, StringComparison.InvariantCultureIgnoreCase))
        process.Attach();
}
I had to use 'ends with' because the process names include the full path to the running exe.
I've a brand new install of Geoserver 2.22 on Ubuntu 22.04 and installation was smooth. I've added the official NetCDF plugin by unzipping the contents to the WEB-INF/lib/ folder, and it shows up as a type in the data store. Great!
I have a selection of NetCDFs that can be loaded successfully elsewhere (QGIS, ArcGIS Pro, Python via Xarray). However, when I attempt to create a new data store, choose NetCDF and select the .nc files, I get the following error message:
There was an error trying to connect to store AFDRS_FSE_curing. Do you want to save it anyway?
Original exception error:
Failed to create reader from file:efs/temp_surface.nc and hints Hints: FORCE_LONGITUDE_FIRST_AXIS_ORDER = true EXECUTOR_SERVICE = java.util.concurrent.ThreadPoolExecutor@1242674b[Running, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 0] FILTER_FACTORY = FilterFactoryImpl STYLE_FACTORY = StyleFactoryImpl FEATURE_FACTORY = org.geotools.feature.LenientFeatureFactoryImpl@6bfaa0a6 FORCE_AXIS_ORDER_HONORING = http GRID_COVERAGE_FACTORY = GridCoverageFactory TILE_ENCODING = null REPOSITORY = org.geoserver.catalog.CatalogRepository@3ef5cfc5 LENIENT_DATUM_SHIFT = true COMPARISON_TOLERANCE = 1.0E-8
What am I missing here? That error message doesn't seem to be highlighting anything obvious as an error...
The NetCDFs are located in: /usr/share/geoserver/data_dir/data
We are using Octopus Deploy to deploy a simple API.
The first step of our deployment process is to generate an HTML report with the delta of the scripts run vs the scripts required to run. I used this tutorial to create the step.
The relevant code in my console application is:
var reportLocationSection = appConfiguration.GetSection(previewReportCmdLineFlag);
if (reportLocationSection.Value is not null)
{
    // Generate a preview file so Octopus Deploy can generate an artifact for approvals
    try
    {
        var report = reportLocationSection.Value;
        var fullReportPath = Path.Combine(report, deltaReportName);
        Console.WriteLine($"Generating upgrade report at {fullReportPath}");
        upgrader.GenerateUpgradeHtmlReport(fullReportPath);
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
        return operationError;
    }
}
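For reference, here is a rough sketch of one way appConfiguration could be built so that a --previewReportPath flag from the command line becomes readable via GetSection. This assumes the app uses Microsoft.Extensions.Configuration's command-line provider and that previewReportCmdLineFlag holds the key name ("previewReportPath"); neither detail is shown in the snippet above, so the real app may differ.
using Microsoft.Extensions.Configuration;

// Hypothetical wiring (not necessarily the app's actual startup code):
// map command-line arguments such as --previewReportPath=C:\SomePath onto configuration keys.
var appConfiguration = new ConfigurationBuilder()
    .AddCommandLine(args)
    .Build();

// Assumed flag name; in the real app this is whatever previewReportCmdLineFlag contains.
const string previewReportCmdLineFlag = "previewReportPath";
var reportLocationSection = appConfiguration.GetSection(previewReportCmdLineFlag);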
The PowerShell which I am using in the script step is:
# Get the extracted path for the package
$packagePath = $OctopusParameters["Octopus.Action.Package[DatabaseUpdater].ExtractedPath"]
$connectionString = $OctopusParameters["Project.Database.ConnectionString"]
$reportPath = $OctopusParameters["Project.HtmlReport.Location"]
Write-Host "Report Path: $($reportPath)"
$exeToRun = "$($packagePath)\DatabaseUpdater.exe"
$generatedReport = "$($reportPath)\UpgradeReport.html"
Write-Host "Generated Report: $($generatedReport)"
if ((Test-Path $reportPath) -eq $false) {
    Write-Host "Creating new directory..."
    New-Item -ItemType Directory -Path $reportPath | Out-Null
} else {
    Write-Host "Directory already exists."
}
# Run this .NET app, passing in the Connection String and a flag
# which tells the app to create a report, but not update the database
& $exeToRun --connectionString="$($connectionString)" --previewReportPath="$($reportPath)"
New-OctopusArtifact -Path "$($generatedReport)"
The error reported by Octopus is:
'Could not find file 'C:\DeltaReports\Some API\2.9.15-DbUp-Test-9\UpgradeReport.html'.'
I'm guessing that is being thrown when this PowerShell line is hit: New-OctopusArtifact ...
And that seems to indicate that the report was never created.
I've used a bit of logging to log out certain variables and the values look sound:
Report Path: C:\DeltaReports\Some API\2.9.15-DbUp-Test-9
Generated Report: C:\DeltaReports\Some API\2.9.15-DbUp-Test-9\UpgradeReport.html
Generating upgrade report at C:\DeltaReports\Some API\2.9.15-DbUp-Test-9\UpgradeReport.html
As you can see in the C#, the relevant code is wrapped in a try/catch block, but I'm not sure whether the error is being written out there or at a later point by Octopus (I'd need to do a pull request to add a marker in the code).
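The marker I have in mind would be something like this (just a sketch; the prefix text is arbitrary), so the task log makes it obvious whether the message came from the app's catch block or from Octopus itself:
catch (Exception ex)
{
    // Hypothetical marker: identifies output from this catch block in the Octopus task log.
    Console.WriteLine($"[DatabaseUpdater] report generation failed: {ex}");
    return operationError;
}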
Can anyone see a way forward with resolving this? Has anyone else encountered this?
Cheers
I recently redid some of the work from that article for this video up on YouTube. I did run into some issues with the .SQL files not being included in the assembly. I think it was after I upgraded to .NET 6. But that might be a coincidence.
Anyway, because the files weren't being included in the assembly, when I ran the command line app via Octopus, it wouldn't properly generate the file for me. I ended up configuring the project to copy the .SQL files to a folder in the output directory instead of embedding them in the assembly. You can view a sample package here.
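If it helps to see the difference in code, here is a minimal sketch of the two DbUp options (assuming the usual DbUp builder API and the same usings as the program further down; the "DeploymentScripts" folder name is only an example):
// Option 1: scripts embedded in the assembly as resources. If the .SQL files
// are not actually embedded at build time, DbUp has no scripts to work with.
var embeddedBuilder = DeployChanges.To
    .SqlDatabase(connectionString)
    .WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
    .LogToConsole();

// Option 2: scripts copied to the output directory and read from disk at run time.
var scriptFolder = Path.Combine(AppContext.BaseDirectory, "DeploymentScripts");
var fileSystemBuilder = DeployChanges.To
    .SqlDatabase(connectionString)
    .WithScriptsFromFileSystem(scriptFolder)
    .LogToConsole();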
One thing that helped me is running the app in a debugger with the same parameters just to make sure it was actually generating the file. I'm sure you already thought of that, but I'd be remiss if I forgot to include it in my answer. :)
FWIW, these are my updated scripts.
First, the Octopus Script:
$packagePath = $OctopusParameters["Octopus.Action.Package[Trident.Database].ExtractedPath"]
$connectionString = $OctopusParameters["Project.Connection.String"]
$environmentName = $OctopusParameters["Octopus.Environment.Name"]
$reportPath = $OctopusParameters["Project.Database.Report.Path"]
cd $packagePath
$appToRun = ".\Octopus.Trident.Database.DbUp"
$generatedReport = "$reportPath\UpgradeReport.html"
& $appToRun --ConnectionString="$connectionString" --PreviewReportPath="$reportPath"
New-OctopusArtifact -Path "$generatedReport" -Name "$environmentName.UpgradeReport.html"
My C# code can be found here, but for ease of use you can see it all below (I'm not proud of how I parse the parameters).
static void Main(string[] args)
{
    var connectionString = args.FirstOrDefault(x => x.StartsWith("--ConnectionString", StringComparison.OrdinalIgnoreCase));
    connectionString = connectionString.Substring(connectionString.IndexOf("=") + 1).Replace(@"""", string.Empty);

    var executingPath = Assembly.GetExecutingAssembly().Location.Replace("Octopus.Trident.Database.DbUp", "").Replace(".dll", "").Replace(".exe", "");
    Console.WriteLine($"The execution location is {executingPath}");

    var deploymentScriptPath = Path.Combine(executingPath, "DeploymentScripts");
    Console.WriteLine($"The deployment script path is located at {deploymentScriptPath}");

    var postDeploymentScriptsPath = Path.Combine(executingPath, "PostDeploymentScripts");
    Console.WriteLine($"The post deployment script path is located at {postDeploymentScriptsPath}");

    var upgradeEngineBuilder = DeployChanges.To
        .SqlDatabase(connectionString, null)
        .WithScriptsFromFileSystem(deploymentScriptPath, new SqlScriptOptions { ScriptType = ScriptType.RunOnce, RunGroupOrder = 1 })
        .WithScriptsFromFileSystem(postDeploymentScriptsPath, new SqlScriptOptions { ScriptType = ScriptType.RunAlways, RunGroupOrder = 2 })
        .WithTransactionPerScript()
        .LogToConsole();

    var upgrader = upgradeEngineBuilder.Build();

    Console.WriteLine("Is upgrade required: " + upgrader.IsUpgradeRequired());

    if (args.Any(a => a.StartsWith("--PreviewReportPath", StringComparison.InvariantCultureIgnoreCase)))
    {
        // Generate a preview file so Octopus Deploy can generate an artifact for approvals
        var report = args.FirstOrDefault(x => x.StartsWith("--PreviewReportPath", StringComparison.OrdinalIgnoreCase));
        report = report.Substring(report.IndexOf("=") + 1).Replace(@"""", string.Empty);

        if (Directory.Exists(report) == false)
        {
            Directory.CreateDirectory(report);
        }

        var fullReportPath = Path.Combine(report, "UpgradeReport.html");

        if (File.Exists(fullReportPath) == true)
        {
            File.Delete(fullReportPath);
        }

        Console.WriteLine($"Generating the report at {fullReportPath}");
        upgrader.GenerateUpgradeHtmlReport(fullReportPath);
    }
    else
    {
        var result = upgrader.PerformUpgrade();

        // Display the result
        if (result.Successful)
        {
            Console.ForegroundColor = ConsoleColor.Green;
            Console.WriteLine("Success!");
        }
        else
        {
            Console.ForegroundColor = ConsoleColor.Red;
            Console.WriteLine(result.Error);
            Console.WriteLine("Failed!");
        }
    }
}
I hope that helps!
After long and detailed investigation, we discovered the answer was quite obvious.
We assumed the existing deploy process configuration was sound, because we had never had a problem with it (until now). As it transpires, there was a problem which led to the Development deployments being deployed twice.
Hence errors like the one above, and others which talked about file handles being held by another process.
It was actually obvious in hindsight, but we were blind to it as we thought the existing process was sound 😣
I came across an interesting problem today that I have been having trouble figuring out, and I wanted to test the waters to see if anyone knows whether this is possible. I'm currently in the process of setting up nvim-dap, and for my situation I need to be able to run debuggers for multiple processes. Suppose I had a configuration that looked like this:
dap.adapters.node2 = {
  type = 'executable',
  command = 'node',
  args = {os.getenv('HOME') .. '/dev/microsoft/vscode-node-debug2/out/src/nodeDebug.js'},
}

dap.configurations.javascript = {
  {
    type = 'node2',
    request = 'attach',
    name = 'program 1',
    port = 9229,
  },
  {
    type = 'node2',
    request = 'attach',
    name = 'program 2',
    port = 7000,
  },
  {
    type = 'node2',
    request = 'attach',
    name = 'program 3',
    port = 5035,
  },
}
Then when I use :lua require'dap'.continue() I will get the option to listen to only one of those running processes. Does anyone know if it's possible to get a debugger going which is able to attach to all of these processes in the same nvim session? Bonus points if it can attach to a chrome process as well for frontend debugging!
It looks like with launch configurations in VS Code you could use something like a compound launch configuration; however, I couldn't find any resource for getting similar functionality with attaching in nvim-dap.
I'd love to see how you all are solving a similar problem!
I have been trying to uninstall applications on devices or users using SCCM. I have been successful in creating an application assignment that would install applications, but I haven't been able to get it to uninstall. The code I have been using is:
IResultObject assignment = this.manager.CreateInstance("SMS_ApplicationAssignment");
IResultObject application = this.manager.GetInstance("SMS_Application.CI_ID=16777339");
assignment["ApplicationName"].StringValue = application["LocalizedDisplayName"].StringValue;
assignment["AssignedCI_UniqueID"].StringValue = application["CI_UniqueID"].StringValue;
assignment["AssignedCIs"].IntegerArrayValue = new[] { application["CI_ID"].IntegerValue};
assignment["AssignmentName"].StringValue = "Deepak's deployment";
assignment["CollectionName"].StringValue = "Deepak's Collection of Devices";
assignment["DisableMomAlerts"].BooleanValue = true;
assignment["NotifyUser"].BooleanValue = false;
assignment["OfferFlags"].IntegerValue = 0;
assignment["DesiredConfigType"].IntegerValue = 1;
assignment["OverrideServiceWindows"].BooleanValue = false;
assignment["RebootOutsideOfServiceWindows"].BooleanValue = false;
assignment["SuppressReboot"].IntegerValue = 0;
assignment["TargetCollectionID"].StringValue = "UKN0000F";
assignment["EnforcementDeadline"].DateTimeValue = DateTime.Now.AddDays(1);
assignment["StartTime"].DateTimeValue = DateTime.Now;
assignment["UseGMTTimes"].BooleanValue = false;
assignment["UserUIExperience"].BooleanValue = false;
assignment["WoLEnabled"].BooleanValue = false;
assignment["RequireApproval"].BooleanValue = true;
assignment["OfferTypeId"].IntegerValue = 2;
assignment.Put();
This code will put up the application as an install deployment in SCCM. How do I get it as an uninstall deployment?
There is an AppAction enumeration, which I suspect is used by the client and not on the server.
typedef enum AppAction
{
    appDiscovery = 0,
    appInstall = 1,
    appUninstall = 2
} AppAction;
Any help would be appreciated!
The setting that needs to be changed is DesiredConfigType.
For your code, set the following before Put() (in your snippet this replaces the existing line that sets DesiredConfigType to 1):
assignment["DesiredConfigType"].IntegerValue = 2;
A value of 1 represents install (required) and a value of 2 represents uninstall (not allowed).
https://msdn.microsoft.com/en-us/library/hh949014.aspx
The way I do it is to first use uninstall.exe to determine the GUID of the program, then create a program for the package I wish to uninstall, and just call uninstall.exe /whatever as the command. This works for most apps that show up in Add/Remove; if an app doesn't show up there, then it will take a hack (or script) to uninstall it anyway. I believe the reason you're falling short is that if there is no uninstall command for the deployment in SCCM, then it has nothing to run.
After you create an uninstall program, you could just call that deployment from your code, and voila.
As long as the target program that you are trying to remove was installed via an MSI (Microsoft Installer), you can loop through the registry to find your product (registry location: "HKEY_LOCAL_MACHINE\Software\Microsoft\Windows\CurrentVersion\Uninstall") and just look at each DisplayName value, as sketched below.
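A rough sketch of that registry loop (plain Microsoft.Win32 registry API; "Some Product Name" is a placeholder, not anything from the question):
using Microsoft.Win32;

// Walk HKLM\...\Uninstall and inspect each DisplayName value.
const string uninstallKeyPath = @"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall";

using (RegistryKey uninstallKey = Registry.LocalMachine.OpenSubKey(uninstallKeyPath))
{
    foreach (string subKeyName in uninstallKey.GetSubKeyNames())
    {
        using (RegistryKey productKey = uninstallKey.OpenSubKey(subKeyName))
        {
            var displayName = productKey?.GetValue("DisplayName") as string;
            if (displayName != null && displayName.Contains("Some Product Name"))
            {
                // For MSI installs the sub key name is typically the product code (GUID),
                // which can then be fed to an uninstall command.
                Console.WriteLine($"{displayName}: {subKeyName}");
            }
        }
    }
}
On 64-bit Windows, 32-bit installers register under SOFTWARE\WOW6432Node\Microsoft\Windows\CurrentVersion\Uninstall as well, so both keys may need checking.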
In our environment, I accomplish this task by using PowerShell, and we set up a program that specifically uninstalls whatever we are after.
Hope this helps...
Raged.
This post has the info about how to run the profiler. I am trying to turn the following batch file:
vsperfcmd /start:coverage /output:run.coverage
hello
vsperfcmd /shutdown
into C# code:
// A guid is used to keep track of the run
Guid myrunguid = Guid.NewGuid();
Monitor m = new Monitor();
m.StartRunCoverage(myrunguid, "run.coverage");
// TODO: Launch some tests or something
// that can exercise myassembly.exe
// Complete the run
m.FinishRunCoverage(myrunguid);
For the TODO: part, I used this code
var p = new Process();
p.StartInfo.FileName = "hello.exe";
p.Start();
p.WaitForExit();

// Look at return code – 0 for success
if (p.ExitCode != 0)
{
    Console.Error.WriteLine("Error in profiling");
    System.Environment.Exit(-3);
}
The code runs fine, but I don't get the profiled result I got when running the batch file.
This is the result from running the batch file, which has all the info.
This is the result from the C# code, which doesn't have the profiling info, only the schema.
What might be wrong?
I asked the same question on the MSDN Forums, and it seems like the method doesn't work.