Insert into a local SQL Server CE database doesn't insert - visual-studio-2010

Hi, I have a local database named Database1.sdf.
I'm accessing it with the following code to insert some data into a table:
public string DoLocalDbCmd(string Command)
{
    int NumeroAffetto;
    string ConnString = @"Data Source=|DataDirectory|\Database1.sdf";
    SqlCeConnection Conn = new SqlCeConnection(ConnString);
    SqlCeCommand Comando = new SqlCeCommand(Command, Conn);
    Comando.CommandType = System.Data.CommandType.Text;
    try
    {
        Comando.Connection.Open();
        NumeroAffetto = Comando.ExecuteNonQuery();
        return NumeroAffetto.ToString();
    }
    catch (Exception ex)
    {
        return ex.Message;
    }
    finally
    {
        Comando.Connection.Close();
    }
}
private void button1_Click(object sender, EventArgs e)
{
    string cmdex = DoLocalDbCmd("insert into TBL_PROVA (BELLO) VALUES ('VERO')");
    MessageBox.Show(cmdex);
}
The code executes without errors, and I get back the number of affected rows = 1.
But if I query the database afterwards, no row has been inserted.
Can somebody suggest what might be wrong?
Thank you in advance,
Piercarlo

This is a common scenario when some or all of these conditions are present in your project:
You have a connection string with the DataDirectory substitution string.
You have a connection in Server Explorer that points to an SDF file located in your project directory.
You have the Copy to Output Directory property set to Copy Always on the SDF file listed among your project items.
When you run the program inside the VS IDE, the SDF file in your project folder is copied to the output directory (BIN\DEBUG) according to the Copy to Output Directory setting, and this can overwrite the file already present in BIN\DEBUG.
You run your code and correctly insert your data into the file in the BIN\DEBUG folder.
You stop your program and check whether the record is present using Server Explorer, and you don't see any new record because you are looking at the file in the project folder.
You start a new debug session, and the file in BIN\DEBUG is overwritten again with the empty one.
So: change the Copy to Output Directory property to Copy Never to stop this copying, and change the connection in Server Explorer to point to your database in the BIN\DEBUG folder (or add a second connection, keeping the old one for schema changes and the new one to verify your DML operations).
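An alternative, if you want the IDE and the running app to always hit the same physical file, is to set DataDirectory yourself at startup. A minimal sketch, assuming this runs once before any connection is opened (the "MyApp" folder name is only an example):

// Pin |DataDirectory| to one fixed folder (e.g. in Main) so every run
// and every tool resolves the same Database1.sdf.
// Requires the System and System.IO namespaces.
string dataDir = Path.Combine(
    Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData),
    "MyApp"); // example folder name, adjust to your project
Directory.CreateDirectory(dataDir); // no-op if it already exists
AppDomain.CurrentDomain.SetData("DataDirectory", dataDir);
// The connection string stays the same:
// @"Data Source=|DataDirectory|\Database1.sdf"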

Related

Problem Generating Html Report Using DbUp during Octopus Deployment

Using Octopus Deploy to deploy a simple API.
The first step of our deployment process is to generate an HTML report with the delta between the scripts already run and the scripts that still need to run. I used this tutorial to create the step.
The relevant code in my console application is:
var reportLocationSection = appConfiguration.GetSection(previewReportCmdLineFlag);
if (reportLocationSection.Value is not null)
{
    // Generate a preview file so Octopus Deploy can generate an artifact for approvals
    try
    {
        var report = reportLocationSection.Value;
        var fullReportPath = Path.Combine(report, deltaReportName);
        Console.WriteLine($"Generating upgrade report at {fullReportPath}");
        upgrader.GenerateUpgradeHtmlReport(fullReportPath);
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
        return operationError;
    }
}
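One note on the snippet above: if the directory in reportLocationSection.Value does not exist at this point, writing the report can fail with exactly this kind of "could not find file" error. A small guard before the GenerateUpgradeHtmlReport call, sketched here with the same names as above (the answer further down does the same in its own code):

var report = reportLocationSection.Value;
// Assumption: DbUp writes the HTML file but does not create missing
// directories itself, so make sure the folder exists first.
if (!Directory.Exists(report))
{
    Directory.CreateDirectory(report);
}
var fullReportPath = Path.Combine(report, deltaReportName);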
The PowerShell I am using in the script step is:
# Get the extracted path for the package
$packagePath = $OctopusParameters["Octopus.Action.Package[DatabaseUpdater].ExtractedPath"]
$connectionString = $OctopusParameters["Project.Database.ConnectionString"]
$reportPath = $OctopusParameters["Project.HtmlReport.Location"]
Write-Host "Report Path: $($reportPath)"
$exeToRun = "$($packagePath)\DatabaseUpdater.exe"
$generatedReport = "$($reportPath)\UpgradeReport.html"
Write-Host "Generated Report: $($generatedReport)"
if ((Test-Path $reportPath) -eq $false) {
    Write-Host "Creating new directory..."
    New-Item -ItemType Directory -Path $reportPath | Out-Null
} else {
    Write-Host "Directory already exists."
}
# Run this .NET app, passing in the Connection String and a flag
# which tells the app to create a report, but not update the database
& $exeToRun --connectionString="$($connectionString)" --previewReportPath="$($reportPath)"
New-OctopusArtifact -Path "$($generatedReport)"
The error reported by Octopus is:
'Could not find file 'C:\DeltaReports\Some API\2.9.15-DbUp-Test-9\UpgradeReport.html'.'
I'm guessing that is being thrown when this PowerShell line is hit: New-OctopusArtifact ...
And that seems to indicate that the report was never created.
I've used a bit of logging to log out certain variables and the values look sound:
Report Path: C:\DeltaReports\Some API\2.9.15-DbUp-Test-9
Generated Report: C:\DeltaReports\Some API\2.9.15-DbUp-Test-9\UpgradeReport.html
Generating upgrade report at C:\DeltaReports\Some API\2.9.15-DbUp-Test-9\UpgradeReport.html
As you can see in the C#, the relevant code is wrapped in a try/catch block, but I'm not sure whether the error is being written out there or at a later point by Octopus (I'd need to do a pull request to add a marker in the code).
Can anyone see a way forward in resolving this? Has anyone else encountered this?
Cheers
I recently redid some of the work from that article for this video up on YouTube. I did run into some issues with the .SQL files not being included in the assembly. I think it was after I upgraded to .NET 6. But that might be a coincidence.
Anyway, because the files weren't being included in the assembly, when I ran the command line app via Octopus, it wouldn't properly generate the file for me. I ended up configuring the project to copy the .SQL files to a folder in the output directory instead of embedding them in the assembly. You can view a sample package here.
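If you take the same route, the project file change is roughly this (a sketch for an SDK-style csproj; the DeploymentScripts folder name matches the C# further down and may differ in your project):

<ItemGroup>
  <!-- Assumption: SDK-style project. Copy the .SQL files beside the exe
       instead of embedding them as assembly resources. -->
  <None Update="DeploymentScripts\**\*.sql">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </None>
</ItemGroup>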
One thing that helped me is running the app in a debugger with the same parameters just to make sure it was actually generating the file. I'm sure you already thought of that, but I'd be remiss if I forgot to include it in my answer. :)
FWIW, these are my updated scripts.
First, the Octopus Script:
$packagePath = $OctopusParameters["Octopus.Action.Package[Trident.Database].ExtractedPath"]
$connectionString = $OctopusParameters["Project.Connection.String"]
$environmentName = $OctopusParameters["Octopus.Environment.Name"]
$reportPath = $OctopusParameters["Project.Database.Report.Path"]
cd $packagePath
$appToRun = ".\Octopus.Trident.Database.DbUp"
$generatedReport = "$reportPath\UpgradeReport.html"
& $appToRun --ConnectionString="$connectionString" --PreviewReportPath="$reportPath"
New-OctopusArtifact -Path "$generatedReport" -Name "$environmentName.UpgradeReport.html"
My C# code can be found here, but for ease of use, you can see it all below (I'm not proud of how I parse the parameters).
static void Main(string[] args)
{
    var connectionString = args.FirstOrDefault(x => x.StartsWith("--ConnectionString", StringComparison.OrdinalIgnoreCase));
    connectionString = connectionString.Substring(connectionString.IndexOf("=") + 1).Replace(@"""", string.Empty);
    var executingPath = Assembly.GetExecutingAssembly().Location.Replace("Octopus.Trident.Database.DbUp", "").Replace(".dll", "").Replace(".exe", "");
    Console.WriteLine($"The execution location is {executingPath}");
    var deploymentScriptPath = Path.Combine(executingPath, "DeploymentScripts");
    Console.WriteLine($"The deployment script path is located at {deploymentScriptPath}");
    var postDeploymentScriptsPath = Path.Combine(executingPath, "PostDeploymentScripts");
    Console.WriteLine($"The post-deployment script path is located at {postDeploymentScriptsPath}");
    var upgradeEngineBuilder = DeployChanges.To
        .SqlDatabase(connectionString, null)
        .WithScriptsFromFileSystem(deploymentScriptPath, new SqlScriptOptions { ScriptType = ScriptType.RunOnce, RunGroupOrder = 1 })
        .WithScriptsFromFileSystem(postDeploymentScriptsPath, new SqlScriptOptions { ScriptType = ScriptType.RunAlways, RunGroupOrder = 2 })
        .WithTransactionPerScript()
        .LogToConsole();
    var upgrader = upgradeEngineBuilder.Build();
    Console.WriteLine("Is upgrade required: " + upgrader.IsUpgradeRequired());
    if (args.Any(a => a.StartsWith("--PreviewReportPath", StringComparison.InvariantCultureIgnoreCase)))
    {
        // Generate a preview file so Octopus Deploy can generate an artifact for approvals
        var report = args.FirstOrDefault(x => x.StartsWith("--PreviewReportPath", StringComparison.OrdinalIgnoreCase));
        report = report.Substring(report.IndexOf("=") + 1).Replace(@"""", string.Empty);
        if (Directory.Exists(report) == false)
        {
            Directory.CreateDirectory(report);
        }
        var fullReportPath = Path.Combine(report, "UpgradeReport.html");
        if (File.Exists(fullReportPath) == true)
        {
            File.Delete(fullReportPath);
        }
        Console.WriteLine($"Generating the report at {fullReportPath}");
        upgrader.GenerateUpgradeHtmlReport(fullReportPath);
    }
    else
    {
        var result = upgrader.PerformUpgrade();
        // Display the result
        if (result.Successful)
        {
            Console.ForegroundColor = ConsoleColor.Green;
            Console.WriteLine("Success!");
        }
        else
        {
            Console.ForegroundColor = ConsoleColor.Red;
            Console.WriteLine(result.Error);
            Console.WriteLine("Failed!");
        }
    }
}
I hope that helps!
After a long and detailed investigation, we discovered the answer was quite obvious.
We had assumed the existing deploy process configuration was sound, because we had never had a problem with it (until now). As it transpires, there was a problem that led to the Development deployments being deployed twice.
Hence the errors like the one above, and others that mentioned file handles being held by another process.
It was obvious in hindsight, but we were blind to it because we trusted the existing process 😣

FileOutputStream throws FileNotFoundException when getting a File with Japanese in the path

When using the Google Drive API, I have this downloadMetadataFile() to handle files:
public void downloadMetadataFile(String fileId, String folderStorePath, String fileName) throws IOException, GeneralSecurityException, GoogleException {
    String path = folderStorePath + "/" + fileName;
    java.io.File file = new java.io.File(path);
    try (FileOutputStream fileOutputStream = new FileOutputStream(file)) {
        Drive drive = createDrive();
        drive.files().get(fileId)
            .executeMediaAndDownloadTo(fileOutputStream);
    }
}
When using the above method with existing folders (izakayaTemplate and 居酒屋):
When path=/reports/template/izakayaTemplate/template3.png, the method works fine and downloads template3.png successfully from Google Drive.
When path=/reports/template/居酒屋/template3.png, the method throws a FileNotFoundException at the line try (FileOutputStream fileOutputStream = new FileOutputStream(file)).
Can somebody please explain this behavior?
Note:
I'm using Spring Boot 2.5, Java 8, and Drive API v3.
I'm running this project on Amazon Linux 1 as a service via DaemonTool.
In the run config file, I have set:
-Dsun.jnu.encoding=UTF-8 \
-Dfile.encoding=UTF-8
Update 1:
After debugging for a while, I found out that the canonical path is wrong for the new file I create, but I don't know why it happens:
getPath: /reports/template/居酒屋/template3.png
getAbsolutePath: /reports/template/居酒屋/template3.png
getCanonicalPath: /reports/template/???/template3.png
After searching, I have found the solution for this problem:
Solution: add export LANG=ja_JP.UTF-8 to the run file.
Explanation: the canonical path is the path the file system considers the canonical means of referencing the file system object it points to. So for the system to compute the canonical path correctly, the environment must have the correct language environment set up, as in this document: https://docs.oracle.com/cd/E23824_01/html/E26033/glset.html. In my question, the correct language environment is ja_JP.UTF-8.

Bulk loading with LoadIncrementalHFiles and subdirectories

I wrote a Spark application that generates HFiles to be used for bulk loading with the LoadIncrementalHFiles command later. As the source data pool is very big, the input files are split into iterations that are processed one after the other. Each iteration creates its own HFile directory, so my HDFS structure looks like this:
/user/myuser/map_data/hfiles_0
... /hfiles_1
... /hfiles_2
... /hfiles_3
...
There are about 500 of these subdirectories in the map_data directory, so I'm looking for a way to call LoadIncrementalHFiles automatically and process these subdirectories in iterations as well.
The corresponding command would be this:
hbase org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles -Dcreate.table=no /user/myuser/map_data/hfiles_0 mytable
I need to change this into an iterative command, because this command does not work with subdirectories (when I call it with the /user/myuser/map_data directory)!
I tried to use a Java Process instance to execute the command above automatically, but this doesn't seem to do anything (no output to the console and also no new rows in my HBase table).
Using the org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles Java class from my code also doesn't work; it's not responding either!
Does anybody have a working example for me? Or is there a parameter that allows running the above hbase command on the parent directory? I'm working with HBase 1.1.2 on a Hortonworks Data Platform 2.5 cluster.
EDIT: I tried to run the LoadIncrementalHFiles command from a Hadoop client Java application, but I'm getting an exception relating to Snappy compression, see Run LoadIncrementalHFiles from Java client.
The solution was to split the hbase org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles -Dcreate.table=no /user/myuser/map_data/hfiles_0 mytable command into its parts (one array element per command part), see this Java code snippet:
TreeSet<String> subDirs = getHFileDirectories(new Path(HDFS_PATH), hadoopConf);
for (String hFileDir : subDirs) {
    try {
        String pathToReadFrom = HDFS_OUTPUT_PATH + "/" + hFileDir;
        String[] execCode = {"hbase", "org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles", "-Dcreate.table=no", pathToReadFrom, hbaseTableName};
        ProcessBuilder pb = new ProcessBuilder(execCode);
        pb.redirectErrorStream(true);
        final Process p = pb.start();
        // Write the output of the Process to the console
        new Thread(new Runnable() {
            public void run() {
                BufferedReader input = new BufferedReader(new InputStreamReader(p.getInputStream()));
                String line = null;
                try {
                    while ((line = input.readLine()) != null)
                        System.out.println(line);
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }).start();
        // Wait for the end of the execution
        p.waitFor();
        ...
}

Deleting files in a shared Google Drive folder using Apps Script

I have a shared folder, and one of the editors regularly adds files to it. I want to keep flushing the folder with the following code by checking each file's last change date. It throws an error, and it seems to me that since I am NOT the owner of the files, I am not able to delete them. Is there any way out?
function sevenDayFlush()
{
    // Log the file names and last-change info for the mentioned folder (by id)
    // Enter the ID between the quotes
    var mfolder = DriveApp.getFolderById('<i keep folder id here>');
    // The following will get the files from the folder.
    var lfiles = mfolder.getFiles();
    while (lfiles.hasNext()) {
        var file = lfiles.next();
        if (new Date() - file.getLastUpdated() > 7 * 24 * 60 * 60 * 1000) {
            // The following will delete the files matching the condition above (older than 7 days in the specified folder).
            Logger.log(file.getName() + '----' + file.getLastUpdated());
            // Here is where the error comes up.. help me.
            file.setTrashed(true);
        }
    }
}

Error Appending to IsolatedStorageFile

I am having some problems with isolated storage. I am trying to append to a file, but when I use the code below, I get an error about invalid arguments on this line:
IsolatedStorageFileStream("Folder\\barcodeinfo.txt", FileMode.Append,
FileMode.OpenOrCreate, myStore))
I think it has something to do with FileMode.Append. I am trying to append to the file rather than create a new one.
// Obtain the virtual store for the application.
IsolatedStorageFile myStore = IsolatedStorageFile.GetUserStoreForApplication();
// Create a new folder and call it "Folder".
myStore.CreateDirectory("Folder");
// Specify the file path and options.
using (var isoFileStream = new IsolatedStorageFileStream("Folder\\barcodeinfo.txt", FileMode.Append, FileMode.OpenOrCreate, myStore))
{
    // Write the data
    using (var isoFileWriter = new StreamWriter(isoFileStream))
    {
        isoFileWriter.WriteLine(textBox1.Text);
        isoFileWriter.WriteLine(textBox2.Text);
        isoFileWriter.WriteLine(textBox3.Text);
    }
}
There is no overload that takes two FileModes. It should be
IsolatedStorageFileStream("Folder\\barcodeinfo.txt", FileMode.Append,
FileAccess.Write, myStore));
Important thing to note about FileMode.Append is:
[FileMode.Append] Opens the file if it exists and seeks to the end of the file, or
creates a new file. Append can only be used in conjunction with Write.
Attempting to seek to a position before the end of the file will throw
an IOException, and any attempt to read fails and throws a
NotSupportedException.
which is why FileAccess.Write is used.
It looks like you have FileMode.Append, FileMode.OpenOrCreate. Those are two FileModes. The first parameter should be a FileMode and the second should be a FileAccess.
That should fix your problem.
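For completeness, a sketch of the question's block with only the offending argument pair changed (everything else as in the original):

// Obtain the virtual store for the application.
IsolatedStorageFile myStore = IsolatedStorageFile.GetUserStoreForApplication();
// Create the folder; harmless if it already exists.
myStore.CreateDirectory("Folder");
// FileMode.Append opens (or creates) the file and seeks to its end;
// pair it with FileAccess.Write, not a second FileMode.
using (var isoFileStream = new IsolatedStorageFileStream("Folder\\barcodeinfo.txt", FileMode.Append, FileAccess.Write, myStore))
using (var isoFileWriter = new StreamWriter(isoFileStream))
{
    isoFileWriter.WriteLine(textBox1.Text);
    isoFileWriter.WriteLine(textBox2.Text);
    isoFileWriter.WriteLine(textBox3.Text);
}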
