I am new to C++ and programming.
I am using the CLion editor. I created a simple program for a homework assignment, and I can't figure out why my output indents every line after the second line. I have searched online and here, but most indentation questions ask how to indent, not how to stop output from indenting when it was never asked to.
Thanks in advance for any help. I appreciate it.
I tried using code to left-align, but that didn't work. I also tried creating a new project and retyping the program, with the same result.
I also tried adding a newline break; that prevents the indent, but then I have a blank line.
I think it may be a setting, but I have no idea which setting to change.
#include <iostream>
#include <iomanip>

using namespace std;

int main()
{
    int numberPennies, numberNickels, numberDimes, numberQuarters, totalCents;

    cout << "Please Enter Number of Coins: " << endl;
    cout << "# of Quarters: ";
    cin >> numberQuarters;
    cout << "# of Dimes: ";
    cin >> numberDimes;
    cout << "# of Nickels: ";
    cin >> numberNickels;
    cout << "# of Pennies: ";
    cin >> numberPennies;

    totalCents = (numberQuarters * 25) + (numberDimes * 10) +
                 (numberNickels * 5) + numberPennies;

    cout << "The total is " << (totalCents / 100) << " dollars and "
         << (totalCents % 100) << " cents ";

    return 0;
}
The output should be left-aligned; instead it appears like this:
Please Enter Number of Coins:
# of Quarters: 13
    # of Dimes: 4
    # of Nickels: 11
    # of Pennies: 17
    The total is 4 dollars and 37 cents
Process finished with exit code 0
It seems like you're doing everything right. I ran this code in Visual Studio 2019 and didn't have the issue you're describing, so I think the indentation you're seeing might be a feature of your IDE rather than something your program is printing.
Try running the .exe you're generating directly instead of using the built-in console in your IDE.
(English isn't my strong suit, so sorry for the awkward phrasing.)
Hi guys! I'm new to C++ and I need to know how to write a filter that only accepts integer input. The filter must use only the iostream library, because my teacher doesn't let us use any other library (we are new to C++).
Here is an example of what I have at the moment:
#include <iostream>
#include <limits> // I shouldn't use this library

using namespace std;

int main() {
    int value = 0;
    cout << "Enter an integer value: ";
    while (!(cin >> value)) {
        cin.clear();
        cin.ignore(numeric_limits<streamsize>::max(), '\n'); // this line needs <limits>
        cout << endl << "Value must be an integer" << endl << endl;
        cout << "Enter another integer value: ";
    }
}
But this code has some problems:
I'm using #include <limits> and I shouldn't use it.
If you enter "1asd" it takes the value 1 and treats it as correct, which it isn't.
Do you guys have any solution for this situation? Thanks in advance for your time.
You just have to check whether the bytes the user entered are numerals, as shown below. If every byte of the entered string is a numeral (i.e. between the characters '0' and '9'), then the whole string is an integer. The exception is the first byte, which may be a '+', a '-', a space/tab, or simply the first numeral of the number (thanks Zett42).
std::cout << "Enter an integer value: ";
std::string res1;
std::cin >> res1;
std::string::iterator it;
for ( it = res1.begin() ; it < res1.end(); it++)
{ std::cout << "checking " << *it << ' ';
if (!( '0' <= *it && *it <= '9' )) {
std::cout << "this is a numeral\n";
} else {
std::cout << "you entered: " << *it << " -- this is *not* a numeral\n";
}
}
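If you also need to avoid <limits> entirely and reject input like "1asd", one way is to read the token as text and convert it yourself. Below is a minimal sketch of that idea using only <iostream>; the helper name parse_int and the buffer size of 64 are arbitrary choices for illustration, and the sketch does not guard against overflow of very long numbers. Because the whole token is validated before conversion, "1asd" is rejected instead of being silently read as 1.

#include <iostream>

// Tries to parse the whole buffer as an integer (optional sign, then digits only).
// Returns true on success and writes the result to 'out'. Hypothetical helper.
bool parse_int(const char* s, int& out) {
    int i = 0;
    bool negative = false;
    if (s[i] == '+' || s[i] == '-') {   // optional leading sign
        negative = (s[i] == '-');
        ++i;
    }
    if (s[i] == '\0') return false;     // a sign alone is not a number
    int value = 0;
    for (; s[i] != '\0'; ++i) {
        if (s[i] < '0' || s[i] > '9') return false;  // any non-digit rejects "1asd"
        value = value * 10 + (s[i] - '0');           // no overflow check in this sketch
    }
    out = negative ? -value : value;
    return true;
}

int main() {
    char buffer[64];                    // arbitrary size for this sketch
    int value = 0;
    std::cout << "Enter an integer value: ";
    std::cin >> buffer;
    while (!parse_int(buffer, value)) {
        std::cout << "\nValue must be an integer\n\n";
        std::cout << "Enter another integer value: ";
        std::cin >> buffer;
    }
    std::cout << "You entered " << value << "\n";
}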
I'm trying to open a fullscreen window using SDL2. I've looked thoroughly at the documentation on display and window management ( https://wiki.libsdl.org/CategoryVideo ), but I don't understand what the best practice is for getting the resolution of the display I am actually working on.
I have the following sample code:
SDL_DisplayMode mMode;
SDL_Rect mRect;

// Bounds of display 0 in the global desktop coordinate space
int ret0 = SDL_GetDisplayBounds(0, &mRect);
std::cout << "bounds w and h are: " << mRect.w << " x " << mRect.h << std::endl;

// Display mode currently in use on display 0
int ret2 = SDL_GetCurrentDisplayMode(0, &mMode);
std::cout << "current display res w and h are: " << mMode.w << " x " << mMode.h << std::endl;

// First (highest-priority) mode in display 0's sorted mode list
int ret3 = SDL_GetDisplayMode(0, 0, &mMode);
std::cout << "display mode res w and h are: " << mMode.w << " x " << mMode.h << std::endl;
I am working on a single display that has a resolution of 1920x1080. However, the printed results are:
(screenshot: program output)
It seems that SDL_GetDisplayMode() is the only function that reports the correct resolution, so I'd be inclined to use that one. However, I've read that SDL_GetDisplayMode() returns display modes sorted by a certain priority, so calling it with index 0 returns the largest supported resolution for the display, which is not necessarily the resolution currently in use (see also: SDL desktop resolution detection in Linux).
My question is: what is the best practice to obtain the correct resolution?
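(A side note on the snippet above: the return values ret0, ret2 and ret3 are never checked, so a failed query would silently leave mRect or mMode holding stale data. A minimal sketch of the same queries with their return codes checked, using SDL_GetError() for diagnostics, might look like the following; the SDL_Init/SDL_Quit calls and the <SDL.h> include are assumptions about the surrounding program, not part of the original snippet.)

#include <SDL.h>
#include <iostream>

int main(int, char**) {
    if (SDL_Init(SDL_INIT_VIDEO) != 0) {
        std::cerr << "SDL_Init failed: " << SDL_GetError() << std::endl;
        return 1;
    }

    SDL_Rect rect;
    if (SDL_GetDisplayBounds(0, &rect) != 0) {
        std::cerr << "SDL_GetDisplayBounds failed: " << SDL_GetError() << std::endl;
    } else {
        std::cout << "bounds: " << rect.w << " x " << rect.h << std::endl;
    }

    SDL_DisplayMode mode;
    if (SDL_GetCurrentDisplayMode(0, &mode) != 0) {
        std::cerr << "SDL_GetCurrentDisplayMode failed: " << SDL_GetError() << std::endl;
    } else {
        std::cout << "current mode: " << mode.w << " x " << mode.h << std::endl;
    }

    if (SDL_GetDisplayMode(0, 0, &mode) != 0) {
        std::cerr << "SDL_GetDisplayMode failed: " << SDL_GetError() << std::endl;
    } else {
        std::cout << "mode 0: " << mode.w << " x " << mode.h << std::endl;
    }

    SDL_Quit();
    return 0;
}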
I am working on a portion of my code that is supposed to output an error message correctly.
Please see the screenshot below; I am running this in bash:
./myProgram < input3a.in | diff -a -y output3a.out -
The left-hand side is what I want to get to.
For some reason an extra "|" is printed before the char array 'line' is printed. I suspected that maybe the char array 'line' is not null-terminated, but it is initialized by cin.getline(), which should null-terminate the char array.
Here I tried printing the 'line' array in my main procedure, and it left the | sign on the line before it.
My question is: why does std::cout display this behaviour?
Thanks
EDIT:
Below is my code in question. Thanks for taking a look again.
#include "char_stack.h"
#include <iostream>
void printErrorLine(int errorSpot, int c_count, char line[]){
    // Print the first line of the error message char by char; at the
    // same time replace each char with a space (keeping tabs)
    for(int x = 0; x <= errorSpot; x++){
        std::cout << line[x];
        if(line[x] != '\t'){
            line[x] = ' ';
        }
    }
    std::cout << std::endl;
    // Print out the second line; if the first line does not have an
    // errorSpot, then don't print it
    if(errorSpot != c_count){
        std::cout << line << std::endl;
    }
}

char findCounterPart(char bracket){
    //pass.
}

int main(int argc, char * argv[]){
    char line[250];     // 250 because the spec sheet detailed max 250 chars per line.
    char c;
    int l_count = 0;    // number of lines already read
    int c_count;        // character count in a line
    char_stack S;
    bool isError = false;

    while(std::cin.peek() != EOF){
        std::cin.getline(line, 250);
        c_count = std::cin.gcount();
        l_count += 1;
        //std::cout << c_count << std::endl << std::endl;

        // loop through the line
        for(int x = 0; x < c_count; x++){
            c = line[x];
            //std::cout << c << " stack size is " << S.size() << std::endl;
            if (c == '(' ||
                c == '{' ||
                c == '['){
                S.push(c);
            }
            else if(c == ')' ||
                    c == '}' ||
                    c == ']'){
                if(S.empty()){
                    std::cout << "Error on line " << l_count << ": Too many " << c << std::endl;
                    isError = true;
                }
                else{
                    char l = S.pop();
                    if(l != findCounterPart(c)){
                        std::cout << "Error on line " << l_count << ": Read " << c
                                  << ", expected " << findCounterPart(l) << std::endl;
                        isError = true;
                    }
                }
            }
            if (isError){
                printErrorLine(x, c_count, line);
                return 0;
            }
        }
    }

    if (!S.empty()){
        c = S.pop();
        std::cout << "Error on line " << l_count << ": Too many " << c << std::endl;
        printErrorLine(c_count, c_count, line);
    }
    else{
        std::cout << "No Errors Found" << std::endl;
    }
    return 0;
}
Learning to be a software engineer is about breaking problems down into manageable chunks, and here we have a couple of doozies. Let's rephrase your question slightly:
I am getting unexpected characters displayed when diffing the output of my program against a file containing the output of a previous run. Currently I think this is because of some weird behaviour of std::cout.
Well, that might be a reasonable assumption; we can't see your code, so we can't know whether you're doing anything peculiar.
But it would have to be peculiar: std::cout is used, well, all over the place. It just doesn't have this behaviour unless your code deliberately writes a | somewhere.
There are a number of steps we could take to resolve this:
1. Run the program a third time in the debugger and step through until you have some idea where the '|' is appearing.
2. Run the program a third time to the console and observe the output.
3. View the output file using a command like cat, less or more, instead of diff.
Option 3 is perhaps the most sensible place to start, since the file is already right there, and after that option 2 gives us a Mark 1 eyeball check.
What we find is: the | does not appear in the file or in the output. It's not coming from your program.
Let's create a couple of .txt files and diff them:
osmith@WOTSIT MINGW64 ~
$ echo -e 'First line\nSecond line' >test1.txt
osmith@WOTSIT MINGW64 ~
$ echo -e 'First line\nFile two line 2' >test2.txt
osmith@WOTSIT MINGW64 ~
$ diff -a -y test1.txt test2.txt
First line                              First line
Second line                           | File two line 2
When using the -y switch, diff prints a column of special characters between the two columns of output to indicate lines that were changed, inserted, or deleted; the | you are seeing simply marks a line that differs between the two files.
So I have a program which reads its input from a specific .txt file. The code is:
void Image::get_image_dimensions(char *fname)
{
    // determine the number of entries in image
    ifstream fin(fname);
    fin >> num_rows;
    fin >> num_columns;
    cout << "...reading from file " << fname << endl;
    cout << "File has " << num_rows << " rows and " << num_columns << " columns" << endl;
    fin.close();
}
The method is called inside main.
After I compile the program with VS2010 and run it, everything works properly. But if I go to the Debug folder of my project and run the program from there, it doesn't read the input anymore and crashes...
What might be the problem?
The current directory is different in the two cases. The best solution is to provide the full path to the file, not just the file name.
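A minimal sketch of that advice: pass a full path rather than a bare file name, and check that the stream actually opened before reading, so a wrong working directory fails loudly instead of crashing later. The small Image struct, the bool return value, and the example path below are illustrative assumptions, not the asker's real code.

#include <fstream>
#include <iostream>

struct Image {
    int num_rows = 0;
    int num_columns = 0;
    bool get_image_dimensions(const char *fname);
};

bool Image::get_image_dimensions(const char *fname)
{
    std::ifstream fin(fname);
    if (!fin) {                        // catches a bad path / wrong working directory
        std::cerr << "Could not open " << fname << std::endl;
        return false;
    }
    fin >> num_rows >> num_columns;
    std::cout << "File has " << num_rows << " rows and "
              << num_columns << " columns" << std::endl;
    return true;
}

int main()
{
    Image img;
    // Full path instead of a bare file name (example path only):
    img.get_image_dimensions("C:\\projects\\myapp\\data\\image3a.txt");
}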
I'm trying to write a custom 'driver' for a keyboard (HID, if it matters), under Windows 7. The final goal is having two keyboards connected to the computer, but mapping all of the keys of one of them to special (custom) functions.
My idea is to use libusb-win32 as the 2nd keyboard's driver, and write a small program to read data from the keyboard and act upon it. I've successfully installed the driver, and the device is recognized from my program, but all transfers timeout, even though I'm pressing keys.
Here's my code:
struct usb_bus *busses;
struct usb_device *dev;
char buf[1024];

usb_init();
usb_find_busses();
usb_find_devices();

busses = usb_get_busses();
dev = busses->devices;                 // only one bus and one device, see below
cout << dev->descriptor.idVendor << '\n' << dev->descriptor.idProduct << '\n';

usb_dev_handle *h = usb_open(dev);
cout << usb_set_configuration(h, 1) << '\n';
cout << usb_claim_interface(h, 0) << '\n';
cout << usb_interrupt_read(h, 129, buf, 1024, 5000) << '\n';   // endpoint 129 = 0x81 (EP1 IN), 5 s timeout
cout << usb_strerror();
cout << usb_release_interface(h, 0) << '\n';
cout << usb_close(h) << '\n';
and it returns:
1133
49941
0
0
-116
libusb0-dll:err [_usb_reap_async] timeout error
0
0
(I'm pressing lots of keys in those 5 seconds)
There's only one bus, one device, one configuration, one interface and one endpoint.
The endpoint has bmAttributes = 3, which implies I should use interrupt transfers (right?).
So why am I not getting anything? Am I misusing libusb? Do you know of a way to do this without libusb?
It's pretty simple, actually: when reading from the USB device, you must read exactly the right number of bytes. You know what that number is by reading wMaxPacketSize from the endpoint descriptor.
Apparently a read request of any other size simply results in a timeout.
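A minimal sketch of that change, reusing dev, h and buf from the code above and assuming the libusb-0.1 style descriptor structs that libusb-win32 exposes (single configuration, interface, alternate setting and endpoint, as described in the question); the "8 bytes" figure in the comment is only the typical size of a boot-protocol keyboard report, not something read from this particular device:

// Read exactly wMaxPacketSize bytes from the interrupt IN endpoint.
struct usb_endpoint_descriptor *ep =
    &dev->config[0].interface[0].altsetting[0].endpoint[0];

int packet_size = ep->wMaxPacketSize;        // e.g. 8 for a typical HID keyboard

int n = usb_interrupt_read(h, ep->bEndpointAddress, buf, packet_size, 5000);
if (n < 0) {
    cout << usb_strerror() << '\n';          // negative return means error or timeout
} else {
    cout << "read " << n << " bytes\n";      // one keyboard report per transfer
}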