Removing header and footer in Word inserts Paragraph character - office-interop

I'm trying to remove all footers in my simple .doc document using interop. The code
foreach (Section sect in oDoc.Sections)
{
    foreach (HeaderFooter headFoot in sect.Footers)
    {
        headFoot.Range.Delete();
    }
}
inserts a new paragraph character into every footer. Sometimes this even pushes text onto a new page. My investigation suggests that the paragraph is inserted as soon as the headFoot.Range object is accessed, so headFoot.Range.Text = String.Empty has no effect either.
I also tried
try { oDoc.StoryRanges.Item(WdStoryType.wdEvenPagesFooterStory).Delete(); }
catch { }
try { oDoc.StoryRanges.Item(WdStoryType.wdFirstPageFooterStory).Delete(); }
catch { }
try { oDoc.StoryRanges.Item(WdStoryType.wdPrimaryFooterStory).Delete(); }
catch { }
but this code does not remove the footers at all. So, does anyone have an idea how I can avoid the paragraph character insertion while removing footers?
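A note on why this happens: every Word story, including a footer, must contain at least one paragraph, and that final paragraph mark cannot be deleted, so Range.Delete() always leaves one empty paragraph behind. A workaround that is sometimes suggested, sketched here without verification, is to clear each footer and then shrink the leftover paragraph mark so it takes no visible space:
foreach (Section sect in oDoc.Sections)
{
    foreach (HeaderFooter footer in sect.Footers)
    {
        if (!footer.Exists) continue;   // skip footer types the section doesn't use
        footer.Range.Delete();          // clears the content but keeps the final paragraph mark
        footer.Range.Font.Size = 1;     // shrink the mark that cannot be removed (workaround, not verified)
    }
}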

Related

GraphQL fragments for wp-graphql Gutenberg Blocks

If I have a query that starts like this:
export const pageQuery = graphql`
  {
    homepage: wordpress {
      pages(where: { title: "Homepage" }) {
        nodes {
          isFrontPage
          blocks {
            name
            ... on WORDPRESS_CoreHeadingBlock {
              name
              attributes {
                align
                content
                level
              }
            }
            ... on WORDPRESS_CoreParagraphBlock {
              parentId
              name
              attributes {
                ... on WORDPRESS_CoreParagraphBlockAttributesV3 {
                  content
                  align
                }
              }
            }
Given that I could have dozens of different Gutenberg blocks coming through, and then the same blocks repeated as InnerBlocks, what is the right way to break this into fragments or otherwise organize it better?
I'm running into a situation where my query ends up hundreds of lines long to account for nested blocks.
I'm working on a WP Gutenberg block parser for Gatsby and don't quite know the right way to approach these block queries.
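To illustrate what I mean, I assume a fragment-based version would look roughly like this (a sketch using the type names from the query above):
fragment HeadingBlock on WORDPRESS_CoreHeadingBlock {
  name
  attributes {
    align
    content
    level
  }
}
and then, inside blocks:
blocks {
  name
  ...HeadingBlock
  # ...one spread per block type
}
but I'm not sure whether that is the recommended way to structure it.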
Thanks!

Issues with making a string accessible to the rest of the program

String[] input;
String output;

void setup() {
  selectInput("Select a file to process:", "fileSelected");
  println("########");
}

void draw() {
  println(output);
}

void fileSelected(File selection) {
  if (selection == null) {
    println("Window was closed or the user hit 'cancel.'");
  } else {
    String filepath = selection.getAbsolutePath();
    input = loadStrings(filepath);
    println(input);
    input.equals(output);
    println(output);
  }
}
I am working on a game project that needs to have a large matrix of integers loaded into a 2D array. I am using Processing 3.4 and was using the selectInput() method as shown in the reference, loading the contents of a file into a string array with loadStrings(), as shown above.
I couldn't get this code to work: if I try to print the contents of input I get the dreaded NullPointerException. I don't know why this is, especially because the variable is a global variable. So I started to use the output variable to get around the null pointer issue. I print input[] and output so that I can see that they have loaded, and I put println(output); in draw() to see if I can access it there. All I get is "null" (without quotes) printed to my console.
It appears that the output string is always null. Even when I made sure that it was declared as a "global level" variable, the variable is still null. I need the variable to be accessible on a public/global level so that the rest of the game code can convert the string into a matrix (which I didn't include here because it isn't important).
How can I load this string so that the rest of my code can use it?
The output string stays null because you are never copying the input into it; the equals method doesn't work like that (it only compares two strings, it never assigns anything). Here is a fixed version of your code that works:
String[] input;
String output;

void setup() {
  selectInput("Select a file to process:", "fileSelected");
  println("########");
}

void draw() {
  if (output != null) {
    println(output);
  }
}

void fileSelected(File selection) {
  if (selection == null) {
    println("Window was closed or the user hit 'cancel.'");
  } else {
    String filepath = selection.getAbsolutePath();
    input = loadStrings(filepath);
    output = "";  // start from an empty string so the concatenation below doesn't begin with "null"
    for (int i = 0; i < input.length; i++) {
      output += input[i] + "\n";
    }
  }
}
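Since your end goal is a large matrix of integers, you may not even need the intermediate output string. A possible variation of fileSelected (a sketch, assuming each line of the file holds numbers separated by spaces or commas) converts the loaded lines straight into a 2D int array:
int[][] matrix;

void fileSelected(File selection) {
  if (selection == null) {
    println("Window was closed or the user hit 'cancel.'");
    return;
  }
  String[] lines = loadStrings(selection.getAbsolutePath());
  matrix = new int[lines.length][];
  for (int i = 0; i < lines.length; i++) {
    // splitTokens breaks the line on spaces/commas; int() converts the pieces to integers
    matrix[i] = int(splitTokens(lines[i], " ,"));
  }
}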

Read in big csv file, validate and write out using uniVocity parser

I need to parse a big CSV file (2 GB). The values have to be validated, the rows containing "bad" fields must be dropped, and a new file containing only the valid rows has to be written out.
I've selected the uniVocity parser library to do that. Please help me understand whether this library is well suited for the task and what approach should be used.
Given the file size, what is the best way to organize read -> validate -> write in uniVocity? Should I read in all rows at once or use an iterator style? And where should parsed and validated rows be stored before they are written to the file?
Is there a way in uniVocity to access a row's values by index? Something like row.getValue(3)?
I'm the author of this library; let me try to help you out:
First, do not try to read all rows at once as you will fill your memory with LOTS of data.
You can get the row values by index.
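Each parsed row is handed to your RowProcessor (shown below) as a plain String[], so index access is simply, for example:
String fourthColumn = row[3]; // zero-based equivalent of the row.getValue(3) from your question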
The fastest approach to read/validate/write would be to use a RowProcessor that holds a CsvWriter and decides when to write or skip each row. I think the following code will help you a bit:
Define the output:
private CsvWriter createCsvWriter(File output, String encoding) {
    CsvWriterSettings settings = new CsvWriterSettings();
    //configure the writer ...
    try {
        return new CsvWriter(new OutputStreamWriter(new FileOutputStream(output), encoding), settings);
    } catch (IOException e) {
        throw new IllegalArgumentException("Error writing to " + output.getAbsolutePath(), e);
    }
}
Redirect the input:
//this creates a row processor for our parser. It validates each row and sends it to the csv writer.
private RowProcessor createRowProcessor(File output, String encoding) {
    final CsvWriter writer = createCsvWriter(output, encoding);
    return new AbstractRowProcessor() {
        @Override
        public void rowProcessed(String[] row, ParsingContext context) {
            if (shouldWriteRow(row)) {
                writer.writeRow(row);
            } else {
                //skip row
            }
        }

        private boolean shouldWriteRow(String[] row) {
            //your validation here
            return true;
        }

        @Override
        public void processEnded(ParsingContext context) {
            writer.close();
        }
    };
}
Configure the parser:
public void readAndWrite(File input, File output, String encoding) {
    CsvParserSettings settings = new CsvParserSettings();
    //configure the parser here

    //tells the parser to send each row to the custom processor, which will validate and redirect all valid rows to the CsvWriter
    settings.setRowProcessor(createRowProcessor(output, encoding));

    CsvParser parser = new CsvParser(settings);
    try {
        parser.parse(new InputStreamReader(new FileInputStream(input), encoding));
    } catch (IOException e) {
        throw new IllegalStateException("Unable to open input file " + input.getAbsolutePath(), e);
    }
}
For better performance you can also wrap the row processor in a ConcurrentRowProcessor.
settings.setRowProcessor(new ConcurrentRowProcessor(createRowProcessor(output, encoding)));
With this, the writing of rows will be performed in a separate thread.
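Calling the method is then just a matter of pointing it at your files (the paths and the encoding below are placeholders):
readAndWrite(new File("/path/to/input.csv"), new File("/path/to/valid-rows.csv"), "UTF-8");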

DataTable RejectChanges does not reset all changes

I'm adding several rows to a DataTable in my strongly typed DataSet and use a TableAdapterManager to insert the changes into my database. If the UpdateAll function of the TableAdapterManager fails, it performs a database rollback of all inserted rows. Unfortunately, DataTable.RejectChanges does not "roll back" the same rows in the DataTable.
A call to the DataTable.RejectChanges method only removes the last row from the DataTable. I want the DataSet to end up in the same state as the database.
Isn't RejectChanges, according to the MSDN documentation, supposed to remove all new (uncommitted) rows? Am I doing something wrong?
My code:
foreach (var item in List)
{
    DataSet.customerRow custRow = ds.customer.NewcustomerRow();
    custRow.name = item.Name;
    try
    {
        ds.customer.AddcustomerRow(custRow);
    }
    catch (Exception ex)
    {
        ProcessException(ex, System.Reflection.MethodBase.GetCurrentMethod().Name);
        valid = false;
    }
}

if (valid)
{
    DataSetTableAdapters.TableAdapterManager adapterManager = new DataSetTableAdapters.TableAdapterManager();
    adapterManager.customerTableAdapter = new DataSetTableAdapters.customerTableAdapter();
    try
    {
        retryPolicy.ExecuteAction(() =>
        {
            adapterManager.UpdateAll(ds);
        });
    }
    catch (Exception ex)
    {
        ds.customer.RejectChanges();
    }
}
else
{
    ds.customer.RejectChanges();
}
The solution is to set adapterManager.BackupDataSetBeforeUpdate = true. This creates an internal backup copy of the DataSet, which replaces the original in case of a failure.
MSDN: Hierarchical Update Overview
"The backup copy is only in memory during the execution of the TableAdapterManager.UpdateAll method. Therefore, there is no programmatic access to this backup dataset because it either replaces the original dataset or goes out of scope as soon as the TableAdapterManager.UpdateAll method has finished running."

How to fill all forms within a page with htmlunit

Is it possible to fill all forms of a page with random data (and then click all buttons) using an HtmlUnit test? Does anyone know of an example?
It is possible to do that:
HtmlPage page = /* your page */;
List<HtmlForm> forms = page.getForms();
for (HtmlForm form : forms) {
    List<HtmlElement> elms = form.getHtmlElementsByTagNames(Arrays.asList("input", "textarea" /* etc. */));
    List<HtmlSubmitInput> submits = new ArrayList<HtmlSubmitInput>();
    for (HtmlElement elm : elms) {
        if (elm instanceof HtmlSubmitInput) {
            submits.add((HtmlSubmitInput) elm);
        } else if (elm instanceof HtmlTextInput || elm instanceof HtmlPasswordInput) {
            // setValueAttribute works on all HtmlUnit versions; recent versions also offer setValue
            ((HtmlInput) elm).setValueAttribute("BLA" /* add your random text here */);
        } // add more input types if you want
    }
    for (HtmlSubmitInput submit : submits) {
        submit.click(); // capture the page returned by click() here
    }
}
For more of those input types, you can check out the HtmlInput JavaDocs.
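If you still need to obtain the page itself, a minimal sketch could look like this (the URL is a placeholder; the classes live in com.gargoylesoftware.htmlunit, or org.htmlunit in HtmlUnit 3.x):
try (WebClient webClient = new WebClient()) {
    HtmlPage page = webClient.getPage("https://example.com/form-page"); // getPage throws IOException, so declare or handle it
    // ...run the loop above against this page
}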
