Use of Xil_Out32 in Xilinx SDK - fpga

In Vivado I successfully made a simple block diagram to control the LEDs of my Zybo board. I can see that the offset address for my LEDs is 0x41200000 and the high address is 0x4120FFFF. Now when I go to the SDK:
#include <xil_printf.h>
#include <xil_types.h>
#include "platform.h"
#include "xgpio_l.h"
volatile u32 *LED_DATA = (u32 *) 0x41200000;

int main()
{
    init_platform();
    xil_printf(" Writing to LEDs: \n\r");
    Xil_Out32((&LED_DATA) + (0x00), 0xFFFFFFFF); // All LEDs ON
    cleanup_platform();
    return 0;
}
I programmed the FPGA and ran the above code, but still no success whatsoever.
Could someone point out my errors?
Thanks in advance.

Your mistake is to use &LED_DATA, which returns the address of the pointer LED_DATA itself, not 0x41200000 as I think you expect.
Try
Xil_Out32(0x41200000, 0xFFFFFFFF);
or
*LED_DATA = 0xFFFFFFFF;
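For reference, here is a minimal corrected sketch of the question's main() built from that fix (the base address is the one from the question's address editor; Xil_Out32 comes from xil_io.h):

#include "xil_io.h"      // Xil_Out32
#include "xil_printf.h"
#include "platform.h"

#define LED_BASEADDR 0x41200000  // AXI GPIO base address from the Vivado address editor

int main()
{
    init_platform();
    xil_printf(" Writing to LEDs: \n\r");
    Xil_Out32(LED_BASEADDR + 0x00, 0xFFFFFFFF); // GPIO data register at offset 0: all LEDs on
    cleanup_platform();
    return 0;
}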

Try
#define ADDR 0x41200000 // write this before the main() function
Then add the following line inside main():
Xil_Out32(ADDR + 0x00000000, 0xFFFFFFFF); // All LEDs ON
This should work.

This works (Gpio1 is an XGpioPs instance from xgpiops.h):
#define ADDRESS_GPIO_0 0x41200000 // Vivado block diagram address editor
XGpioPs Gpio1;
XGpioPs_Config *ConfigPtr1 = XGpioPs_LookupConfig(XPAR_PS7_GPIO_0_DEVICE_ID);
XGpioPs_CfgInitialize(&Gpio1, ConfigPtr1, ADDRESS_GPIO_0);
XGpioPs_SetDirection(&Gpio1, XGPIOPS_BANK0, 0x0F);
XGpioPs_Write(&Gpio1, XGPIOPS_BANK0, 0x0F);

Thank you for this post. It helped me resolve a compile issue in the SDK. The issue was that the line below would not compile.
xil_printf("Wrote: 0x%08x \n\r", *(baseaddr_p+0));
I added this and it worked:
include "xil_printf.h"
Thanks so much
Rajat Sewal

Related

use RNG library in stm32f4xx

I want to write simple code to generate a random number with the built-in hardware on an STM32F4xx Discovery board. I wrote the code below, but it does not work. It gets stuck in the inner while loop and the flag is never set, so it never jumps out of the loop.
#include <stm32f4xx.h>
#include <stm32f4xx_rng.h>
#include <stm32f4xx_rcc.h>
void RNG_Config(void)
{
    /* Enable RNG clock source */
    RCC_AHB2PeriphClockCmd(RCC_AHB2Periph_RNG, ENABLE);
    /* RNG Peripheral enable */
    RNG_Cmd(ENABLE);
}

int main(void)
{
    uint32_t temp = 0;
    RNG_Config();
    while (1)
    {
        while (RNG_GetFlagStatus(RNG_FLAG_DRDY) == RESET);
        temp = RNG_GetRandomNumber();
    }
}
Simply study the STMicroelectronics examples at
STM32CubeH7-master\Projects\NUCLEO-H743ZI\Examples\RNG\RNG_MultiRNG
The code can be downloaded from GitHub; search for STM32CubeH7-master on GitHub.
I have solved this problem myself by adding SystemInit() at the beginning of the main function.
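A minimal sketch of that fix against the SPL-based code from the question (SystemInit() brings up the clock tree before the RNG is enabled):

#include <stm32f4xx.h>
#include <stm32f4xx_rng.h>
#include <stm32f4xx_rcc.h>

int main(void)
{
    uint32_t temp = 0;

    SystemInit();                                        // configure the system clocks first
    RCC_AHB2PeriphClockCmd(RCC_AHB2Periph_RNG, ENABLE);  // enable the RNG clock source
    RNG_Cmd(ENABLE);                                     // enable the RNG peripheral

    while (1)
    {
        while (RNG_GetFlagStatus(RNG_FLAG_DRDY) == RESET); // wait until a random word is ready
        temp = RNG_GetRandomNumber();
        (void)temp;                                        // use the value as needed
    }
}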

AVR Studio 7 declaration error

I need your help.
I am working with an ATmega128A in AVR Studio 7, but there is one problem.
When I set DDRB and PORTB inside main(), it works fine, but if I set DDRB and PORTB outside of main(), I get the error
'expected identifier or '(' before volatile'
I just want to know why DDRB and PORTB can only be handled inside main().
Here is my code:
#define F_CPU 14745600UL
#include <avr/io.h>
#include <util/delay.h>

DDRB = 0xFF;
PORTB = 0x00;

int main(void)
{
    /* Replace with your application code */
    PORTB = 0x01;
    _delay_ms(300);
    while (1)
    {
        PORTB <<= 1;
        _delay_ms(300);
        if (PORTB == 0x80) {
            PORTB = 0x01;
            _delay_ms(300);
        }
    }
}
C is not a scripting language. Any line of code that actually runs must be inside a function. You can make a new function and call it from main.
They need to be assigned inside of a function because they are macros that end up getting replaced with something that looks like this:
(*(volatile uint8_t *)<address>)
where <address> is a memory address that corresponds to the register you are trying to access. The assignment therefore expands to a statement that casts and dereferences a pointer, and statements are not valid outside of a function.
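A minimal sketch of what both answers suggest: wrap the register setup in a function (io_init here is just a placeholder name) and call it from main():

#define F_CPU 14745600UL
#include <avr/io.h>
#include <util/delay.h>

// Register assignments are statements, so they must live inside a function.
static void io_init(void)
{
    DDRB  = 0xFF;   // all PORTB pins as outputs
    PORTB = 0x00;   // start with all pins low
}

int main(void)
{
    io_init();      // called once at startup
    PORTB = 0x01;
    _delay_ms(300);
    while (1)
    {
        PORTB <<= 1;
        _delay_ms(300);
        if (PORTB == 0x80) {
            PORTB = 0x01;
            _delay_ms(300);
        }
    }
}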

Control relay from PIC18 microchip

I have a PIC18F24K20 microchip and want to control a relay. It works fine from my RasPi over GPIO, but I can't get it working through my microchip.
My test program is this:
#include <xc.h>

#define R1      LATBbits.LATB0
#define R1_TRIS TRISBbits.RB0
#define R2      LATBbits.LATB1
#define R2_TRIS TRISBbits.RB1

void main(void) {
    R1_TRIS = 0;
    R2_TRIS = 0;
    R1 = 1;
    R2 = 0;
    return;
}
What am I doing wrong?
1. Replace the return; with:
    while (1)
    {
        ClrWdt();
    }
2. According to the datasheet, RB0 and RB1 have several modules connected to these pins, so you should verify they are turned off:
Analog,
ECCP,
Comparator.
BTW, why use two pins to control one relay?
3. You may need to add a driver in order to operate the relay.
According to the datasheet, add the following initialization code:
CCP1CON = 0;
CCP2CON = 0;
ADCON0 = 0;
CM1CON0 = 0;
CM2CON0 = 0;
Also, the PBADEN configuration bit should be zero.
The main function should never return on embedded PIC processors. In some implementations it causes a software reset, which would make your pins go back to high-impedance mode. Try adding while (1); at the end of your main.
Check if the pins you use have other functions. The typical gotcha is that the pins double as analog pins and are enabled by default.
Disable them by looking up which AN pin they correspond to in the datasheet and disable them with code like
ANSEL.ANS0 = 0;
ANSEL.ANS1 = 0;
If you enable watchdog functionality you also might want to add a
ClrWdt();
to the main while loop (which was a good suggestion from Mathieu).
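Putting these answers together, a hedged sketch of a test program, assuming XC8 (the exact ANSEL/ANSELH bits that cover RB0/RB1 should be verified against the PIC18F24K20 datasheet):

#include <xc.h>

#define R1      LATBbits.LATB0
#define R1_TRIS TRISBbits.TRISB0

void main(void) {
    // Turn off the peripherals that share RB0/RB1, as suggested above
    CCP1CON = 0;
    CCP2CON = 0;
    ADCON0  = 0;
    CM1CON0 = 0;
    CM2CON0 = 0;
    ANSEL   = 0;     // assumption: disable all analog inputs; check which ANx
    ANSELH  = 0;     // bits actually map to RB0/RB1 in the datasheet

    R1_TRIS = 0;     // RB0 as a digital output
    R1 = 1;          // energize the relay (through a suitable driver transistor)

    while (1) {
        ClrWdt();    // never return from main; keep the watchdog cleared
    }
}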

EEPROM in AVR doesn't work

I'm a beginner in C. I'm trying to use the EEPROM memory in my ATmega8 and ATtiny2313.
Based on this tutorial I've created the following code:
1) writes a number to address 5 in the uC's EEPROM
#define F_CPU 1000000UL
#include <avr/eeprom.h>

int main()
{
    uint8_t number = 5;
    eeprom_update_byte((uint8_t *) 5, number);
    while (1)
    {
    }
}
2) blinks the LED n times, where n is the number read from address 5 in the EEPROM
#define F_CPU 1000000UL
#include <avr/io.h>
#include <util/delay.h>
#include <avr/eeprom.h>

int main()
{
    DDRB = 0xFF;
    _delay_ms(1000);
    int number;
    number = eeprom_read_byte((uint8_t *) 5);
    for (int i = 0; i < number; i++) // blinking 'number' times
    {
        PORTB |= (1 << PB3);
        _delay_ms(100);
        PORTB &= (0 << PB3);
        _delay_ms(400);
    }
    while (1)
    {
    }
}
The second program blinks the LED many times, and it's never the number that is supposed to be in the EEPROM. What's the problem? This happens on both the ATmega8 and the ATtiny2313.
EDIT:
Console results after compilation of the first program:
18:01:55 **** Incremental Build of configuration Release for project eeprom ****
make all
Invoking: Print Size
avr-size --format=avr --mcu=attiny2313 eeprom.elf
AVR Memory Usage
Device: attiny2313
Program: 102 bytes (5.0% Full)
(.text + .data + .bootloader)
Data: 0 bytes (0.0% Full)
(.data + .bss + .noinit)
Finished building: sizedummy
18:01:56 Build Finished (took 189ms)
That is one of the classic beginner pitfalls :-)
If you compile simply with avr-gcc <source> -o <out> you will get the wrong results here, because you need optimization! The write procedure MUST be optimized to perform the write access correctly. So please use '-Os' or '-O3' when compiling with avr-gcc.
If you have no idea whether your problem comes from reading or writing the EEPROM, read your EEPROM data back with avarice/avrdude or similar tools.
The next pitfall can be that you erase your EEPROM section when you program the flash. So please have a look at what your programmer really does! A full chip erase erases the EEPROM as well.
Next pitfall: which fuses have you set? Are you running at the expected clock rate? Maybe you have programmed the internal clock and your external crystal seems to be working at the wrong speed?
Another one: have a look at the fuses again! JTAG pins switched off? Maybe you only see the JTAG pins flickering :-)
Please add the compiler and programming commands to your question!
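For example, a hedged sketch of the kind of build and verify commands meant here (the programmer type -c usbasp and the file names are placeholders):

avr-gcc -mmcu=attiny2313 -Os eeprom_write.c -o eeprom.elf    # -Os: the EEPROM routines must be optimized
avrdude -p t2313 -c usbasp -U eeprom:r:eeprom_dump.hex:i     # read the EEPROM back to check what was actually written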

USART problems with ATmega16

I have an ATmega16 and have looped back Rx and Tx (just connected Rx to Tx) to send and receive one char in a loop. But I only seem to be receiving 0x00 instead of the char I send.
I have the CPU configured to 1 MHz.
But my thought is that since Rx and Tx are just looped, it shouldn't matter what speed I set, since both are the same?
So basically, I'm trying to get an LED on PORTC to flash when receiving the correct char.
Here is the code:
#ifndef F_CPU
#define F_CPU 10000000
#endif

#define BAUD 9600
#define BAUDRATE ((F_CPU)/(BAUD*16)-1)

#include <avr/io.h>
#include <util/delay.h>

void uart_init(void){
    UBRRH = (BAUDRATE>>8);
    UBRRL = BAUDRATE;
    UCSRB = (1<<TXEN) | (1<<RXEN);
    UCSRC = (1<<URSEL) | (1<<UCSZ0) | (1<<UCSZ1);
}

void uart_transmit(unsigned char data){
    while (!(UCSRA & (1<<UDRE)));
    UDR = data;
}

unsigned char uart_recive(void){
    while(!(UCSRA) & (1<<RXC));
    return UDR;
}

int main(void)
{
    uart_init();
    unsigned char c;
    PORTC = 0xff;
    DDRC = 0xff;
    while(1)
    {
        _delay_ms(200);
        uart_transmit(0x2B);
        c = uart_recive();
        if(c==0x2B){
            PORTC = PORTC ^ 0xff;
        }
    }
}
Any thoughts on what I am doing wrong?
The code seems right.
Things you may have to check:
whether your baud rate is the one you should have
whether you meant to send a char like 'p'; right now you are sending a '+'
whether your serial port configuration matches your UART configuration
I think the last one is the problem.
You can try this code from the ATmega manual:
/* Set frame format: 8 data, 2 stop bits */
UCSRC = (1<<URSEL)|(1<<USBS)|(3<<UCSZ0);
After building your program, go to your port configuration and make sure it is set to an 8-bit data format and 2 stop bits. Then test it on your microcontroller and see what happens. Please come back with the result.
Consider the real baud rate accuracy. See e.g. http://www.wormfood.net/avrbaudcalc.php?postbitrate=9600&postclock=1; the AVR has a 7.5% error for 9600 baud at a 1 MHz clock, which is a rather high error. It depends on what you are sending and receiving. "Normally" you would see garbage; if you permanently receive 0x00s, it looks like another problem.
Your F_CPU is set to 10 MHz, but you said the CPU is configured to 1 MHz.
Also check your fuses to see whether you really activated the crystal.
If you just use the internal oscillator: it has a relatively large error, so your UART timing may be broken (I never had problems using the internal oscillator for debugging, though).
Another source of error may be your F_CPU definition. Often this preprocessor constant is already defined (possibly also wrong) in the Makefile (or in the IDE project settings), so the #define in your code has no effect because of the #ifndef.
The PORTC pins (TDI, TMS, TCK, SDA) are always high because these pins are used for JTAG, and JTAG is enabled by default. If you want to use PORTC in your application, you have to disable JTAG by setting the JTAGEN fuse bit to 1 (unprogrammed); a fuse bit of 0 means programmed and 1 means unprogrammed. One more thing: if you use more than 8 MHz, you have to set the corresponding fuse bits, otherwise your program will give unexpected or wrong results. Thanks.
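As a hedged sketch of keeping F_CPU and the baud divisor consistent, avr-libc's <util/setbaud.h> computes UBRR from F_CPU and BAUD and emits a compile-time warning if the resulting error is too large (register names below are the ATmega16 ones):

#define F_CPU 1000000UL   // must match the real clock selected by the fuses
#define BAUD 9600
#include <avr/io.h>
#include <util/setbaud.h>

static void uart_init(void)
{
    UBRRH = UBRRH_VALUE;                 // divisor computed by setbaud.h
    UBRRL = UBRRL_VALUE;
#if USE_2X
    UCSRA |= (1 << U2X);                 // double-speed mode to reduce the baud error
#endif
    UCSRB = (1 << TXEN) | (1 << RXEN);
    UCSRC = (1 << URSEL) | (1 << UCSZ1) | (1 << UCSZ0);  // 8N1
}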
