How to convert a text label to an integer? - Swift 2

What is the right way to convert a text label to an integer in Swift 2? The following keeps throwing the error Fix-It: Insert ";".
let deptid = (Int)myCell?.deptid.text

You cannot cast an optional String to Int. Use the Int initializer instead:
if let text = myCell.deptid?.text, num = Int(text) {
print(num)
}

Related

How to remove extra spaces in Power Query

I want to remove extra spaces from text. I referenced the code below from the internet:
(text as text)=>
let
    x = Text.Split(text," "),
    y = Text.Select(x,each _<>""),
    z = Text.Combine(y," ")
in
    z
When I apply this function to my data, it shows the error "Expression.Error: We cannot convert a value of type List to type Text." My column is definitely already in text format, and I don't know the root of the issue. Could you please take a look?
My data is very simple, like below:
The error occurs because Text.Split returns a list, while Text.Select expects a text value; the list should be filtered with List.Select instead. You can use the code below as a custom function:
(text as text, optional char_to_trim as text) =>
let
    char = if char_to_trim = null then " " else char_to_trim,
    split = Text.Split(text, char),
    removeblanks = List.Select(split, each _ <> ""),
    result = Text.Combine(removeblanks, char)
in
    result

Cannot cast object '(10)' with class 'java.lang.String' to class 'java.lang.Integer'

I am using iReport 3.7.1 and have made a connection to my database. I have a procedure which, given a number as input, returns the word form of the number, i.e. if I give the input 10, it will return ten. The problem is that when I execute the procedure in PL/SQL Developer I get the proper output, but when I run the same procedure in iReport it gives me this exception:
Cannot cast object '(10)' with class 'java.lang.String' to class 'java.lang.Integer' .
Casting straight from a String to an Integer is not possible. You'll want to use the function Integer.parseInt(stringNumber);
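For example, a minimal sketch (plain Java, outside iReport) of the behaviour involved: a plain digit string parses fine, but the parenthesized form from the error message does not:
System.out.println(Integer.parseInt("10"));   // 10
try {
    Integer.parseInt("(10)");                 // parentheses are not part of a valid integer
} catch (NumberFormatException e) {
    System.out.println("Not a plain integer: " + e.getMessage());
}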
(10) isn't a properly formatted integer. Not even for PL/SQL:
select '(10)' +0 from dual;
> ORA-01722: invalid number
I can only suggest that you trace back to the point where those ( ) come from and fix your code at that position instead. Just a wild guess: some number formats use parentheses to represent negative numbers. Maybe this is your case?
That being said, if you still want to locally remove the parentheses that have somehow lurked inside your string:
String str = "(10)";
int value = Integer.parseInt(str.substring(1, str.length()-1));
//                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
//                           *blindly* strip off the first and last character,
//                           assuming those are `(` and `)`
For something a little bit more robust, and assuming parentheses denote negative numbers, you could try a regex:
String str = "(10)";
str = str.replaceFirst("\\(([0-9]+)\\)", "-$1");
//                      ^^^        ^^^    ^
// replace the integer between parentheses with its negative value
// i.e.: "(10)" becomes "-10" (as a *string*)
int value = Integer.parseInt(str);

How to print/scan Underscores in Numeric Literals

Team,
I am not able to use the Java 7 underscores-in-numeric-literals feature when getting input from the user and printing it out in the same format as declared. How can I do that? Or is this feature incomplete?
Scanner input = new Scanner( System.in );
int x = 1_00_000;
System.out.print( "Enter numeric literals with underscores: " ); //2_00_000
x = input.nextInt(); //java.util.InputMismatchException
System.out.println(x); // Prints in normal format, but want to be in 2_00_000.
NOTE: In Eclipse, I am able to change the value of a numeric literal to an underscored numeric literal at runtime. This may be a hack, but isn't inputting underscored numeric literals at runtime a needed feature?
http://www.eclipse.org/jdt/ui/r3_8/Java7news/whats-new-java-7.html#miscellaneous
If you want to maintain the underscores, you can read the input as a String:
Scanner input = new Scanner( System.in );
System.out.print( "Enter numeric literals with underscores: " ); //2_00_000
String stringLiterals = input.nextLine();
System.out.println(stringLiterals); // Prints 2_00_000.
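Underscores in numeric literals are a compile-time notation only, so Scanner.nextInt and Integer.parseInt will not accept them directly. A minimal sketch of one way to keep the underscored text for display and still recover the numeric value by stripping the underscores before parsing:
Scanner input = new Scanner( System.in );
System.out.print( "Enter numeric literals with underscores: " ); // e.g. 2_00_000
String literal = input.nextLine();                      // keep the underscored text
int value = Integer.parseInt(literal.replace("_", "")); // strip underscores -> 200000
System.out.println(literal + " = " + value);            // prints 2_00_000 = 200000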

Converting Characters to ASCII Code & Vice Versa In C++/CLI

I am currently learning C++/CLI and I want to convert a character to its decimal ASCII code and vice versa (for example, 'A' = 65).
In Java, this can be achieved by a simple type cast:
char ascii = 'A';
char retrieveASCII = ' ';
int decimalValue;
decimalValue = (int)ascii;
retrieveASCII = (char)decimalValue;
Apparently this method does not work in C++/CLI; here is my code:
String^ words = "ABCDEFG";
String^ getChars;
String^ retrieveASCII;
int decimalValue;
getChars = words->Substring(0, 1);
decimalValue = Int32::Parse(getChars);
retrieveASCII = decimalValue.ToString();
I am getting this error:
A first chance exception of type 'System.ArgumentOutOfRangeException' occurred in mscorlib.dll
Additional information: Input string was not in a correct format.
Any Idea on how to solve this problem?
Characters in a TextBox::Text property are in a System::String type. Therefore, they are Unicode characters. By design, the Unicode character set includes all of the ASCII characters. So, if the string only has those characters, you can convert to an ASCII encoding without losing any of them. Otherwise, you'd have to have a strategy of omitting or substituting characters or throwing an exception.
The ASCII character set has one encoding in current use. It represents all of its characters in one byte each.
// using ::System::Text;
const auto asciiBytes = Encoding::ASCII->GetBytes(words->Substring(0,1));
const auto decimalValue = asciiBytes[0]; // the length is 1 as explained above
const auto retrieveASCII = Encoding::ASCII->GetString(asciiBytes);
Decimal is, of course, a representation of a number. I don't see where you are using decimal except in your explanation. If you did want to use it in code, it could be like this:
const auto explanation = String::Concat(
    "The encoding (in decimal) for the first character in ASCII is ",
    decimalValue.ToString());
Note the use of auto. I have omitted the types of the variables because the compiler can figure them out. It allows the code to be more focused on concepts rather than boilerplate. Also, I used const because I don't believe the value of "variables" should be varied. Neither of these is required.
BTW, all of this applies to Java, too. If your Java code works, it is only by coincidence. If it had been written properly, it would have been easy to translate to .NET. Java's String and Charset classes have very similar functionality to .NET's String and Encoding classes. (Encoding is the proper term, though.) They both use the Unicode character set and the UTF-16 encoding for strings.
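For comparison, a rough Java equivalent of the C++/CLI snippet above (illustrative only), using java.nio.charset.StandardCharsets for the ASCII encoding:
// import java.nio.charset.StandardCharsets;
String words = "ABCDEFG";
byte[] asciiBytes = words.substring(0, 1).getBytes(StandardCharsets.US_ASCII);
int decimalValue = asciiBytes[0];                                         // 65 for 'A'
String retrieveASCII = new String(asciiBytes, StandardCharsets.US_ASCII); // "A"
System.out.println(decimalValue + " " + retrieveASCII);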
More like Java than you think
String^ words = "ABCDEFG";
Char first = words[0];
String^ retrieveASCII;
int decimalValue = (int)first;
retrieveASCII = decimalValue.ToString();

Real Studio: How do I get the index of a substring?

How can I get the index of a substring from a string using Real Studio?
For example, I want to get the index of World in the example below:
Dim str As String = "Hello World"
Look at the Instr() method in http://docs.realsoftware.com/index.php/Instr
Dim pos As Integer
pos = str.InStr("World")
