Setting GPIO in MLO does not work - boot

I want to know how to pin-multiplex pins in the initial phase of boot, i.e. in the SPL (MLO).
What I am trying to do is change the default pin configuration to a GPIO one, so that I can see a high or low level on the pin.
On the P8 header I tried to change pin 7 from its default mode 0 ('TIMER4') to gpio2[2], i.e. mode 7. So I did this:
static struct module_pin_mux gpio2_2_pin_mux[] = {
    {OFFSET(gpmc_wen), (MODE(7) | PULLUDEN)},
    {-1},
};
and called this function:
configure_module_pin_mux(gpio2_2_pin_mux);
in board/ti/am335x/mux.c.
However, I didn't see any voltage on pin 7 of the P8 header.
What is the correct way to do this?
File link: http://textuploader.com/5eh6u
You can search for '?' in the file to see what I added.
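For reference, here is my understanding of what has to happen besides the mux change before the pin can show a level - a minimal sketch using U-Boot's readl()/writel() helpers; the CM_PER_GPIO2_CLKCTRL and GPIO2 register addresses are my reading of the AM335x TRM and should be double-checked:
#include <asm/io.h>

/* Assumed addresses (AM335x TRM) - verify before use */
#define CM_PER_GPIO2_CLKCTRL 0x44E000B0 /* GPIO2 interface clock control */
#define GPIO2_BASE           0x481AC000
#define GPIO2_OE             (GPIO2_BASE + 0x134)
#define GPIO2_SETDATAOUT     (GPIO2_BASE + 0x194)

static void drive_gpio2_2_high(void)
{
    /* Enable the GPIO2 module clock (MODULEMODE = enable) */
    writel(0x2, CM_PER_GPIO2_CLKCTRL);

    /* Make gpio2_2 an output: clear its bit in the output-enable register */
    writel(readl(GPIO2_OE) & ~(1 << 2), GPIO2_OE);

    /* Drive the pin high */
    writel(1 << 2, GPIO2_SETDATAOUT);
}
If the module clock were still off, the GPIO register writes would not take effect, which could also explain seeing nothing on the pin.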
P.S.
As a sanity check, I also read back the pin-mux setting of uart0 to see whether it matches what is configured.
So I wrote this in
./arch/arm/cpu/armv7/omap-common/boot-common.c:
void spl_board_init(void)
{
    /*
     * Save the boot parameters passed from romcode.
     * We cannot delay the saving further than this,
     * to prevent overwrites.
     */
    save_omap_boot_params();

    unsigned int mfi;

    /* Read back a pad configuration register in the control module */
    mfi = *(volatile unsigned int *)0x44E10980;
    printf("1======> %x\n", mfi);

    /* Prepare console output (prints the U-Boot version, date and time) */
    preloader_console_init();

    mfi = *(volatile unsigned int *)0x44E10980;
    printf("2======> %x\n", mfi);

    /* more init code ... */
}
I wanted to verify this setting, which is done in board/ti/am335x/mux.c:
static struct module_pin_mux uart0_pin_mux[] = {
    {OFFSET(uart0_rxd), (MODE(0) | PULLUP_EN | RXACTIVE)}, /* UART0_RXD */
    {OFFSET(uart0_txd), (MODE(0) | PULLUDEN)},             /* UART0_TXD */
    {-1},
};
But it printed the value 0x37, which means the pin is in GPIO mode (mode 7).
How is it possible that a pin that should be in mode 0 is in mode 7?
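To double-check which register I am actually reading, I could also dump the pad registers that, per my reading of the AM335x TRM, belong to uart0 (conf_uart0_rxd at 0x44E10970 and conf_uart0_txd at 0x44E10974 - these offsets are my assumption, so 0x44E10980 may well be a different pad entirely):
/* Sketch: print the uart0 pad registers as documented in the TRM
 * (offsets assumed - double-check them against the manual) */
printf("uart0_rxd conf: %x\n", *(volatile unsigned int *)0x44E10970);
printf("uart0_txd conf: %x\n", *(volatile unsigned int *)0x44E10974);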

Related

How to use the writeback connector via the Linux DRM API / libdrm?

I recently started learning the Direct Rendering Manager (DRM) and tried to use the writeback connector through libdrm.
There are some documents and code showing how the kernel implements it, but not enough on how userspace uses the API, such as drmModeAtomicAddProperty and drmModeAtomicCommit for the writeback connector in libdrm.
I have referred to libdrm/tests/modetest, the Linux API Reference, linux/v6.0-rc5/source/drivers/gpu/drm/drm_writeback.c and some patch information.
I used modetest to get some driver information and tried the following code using libdrm:
/* I previously got the writeback connector id: wbc_conn_id,
 * created an output framebuffer: fb_id,
 * and found the active CRTC: crtc_id.
 * Next, get the writeback connector properties. */
props = drmModeObjectGetProperties(fd, wbc_conn_id, DRM_MODE_OBJECT_CONNECTOR);
printf("the number of properties in connector %d : %d\n", wbc_conn_id, props->count_props);
writeback_fb_id_property = get_property_id(fd, props, "WRITEBACK_FB_ID");
writeback_crtc_id_property = get_property_id(fd, props, "CRTC_ID");
printf("writeback_fb_id_property: %d\n", writeback_fb_id_property);
printf("writeback_crtc_id_property: %d\n", writeback_crtc_id_property);
drmModeFreeObjectProperties(props);

/* Atomic writeback connector update */
req = drmModeAtomicAlloc();
ret = drmModeAtomicAddProperty(req, wbc_conn_id, writeback_crtc_id_property, crtc_id);
printf("%d\n", ret);
ret = drmModeAtomicAddProperty(req, wbc_conn_id, writeback_fb_id_property, buf.fb_id);
printf("%d\n", ret);
ret = drmModeAtomicCommit(fd, req, DRM_MODE_ATOMIC_ALLOW_MODESET, NULL);
if (ret) {
    fprintf(stderr, "Atomic Commit failed [1]\n");
    return 1;
}
drmModeAtomicFree(req);
printf("drmModeAtomicCommit Set Writeback\n");
getchar();
It turns out that the drmModeAtomicCommit call fails. Was any property set wrongly, or is one missing? The two drmModeAtomicAddProperty calls return 1 and 2, and the atomic commit returns EINVAL (-22).
I've looked around but found no solution or similar question about setting the properties of a writeback connector.
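For what it's worth, one thing I would double-check is the DRM client capabilities, since writeback connectors are only exposed to clients that opt in. A minimal sketch of the setup I mean (open_card_for_writeback is my own helper name, and whether a missing cap is the cause of the EINVAL here is an assumption):
#include <fcntl.h>
#include <unistd.h>
#include <xf86drm.h>

/* Writeback connectors are hidden unless the client enables both the
 * atomic capability and the writeback-connectors capability. */
int open_card_for_writeback(const char *path)
{
    int fd = open(path, O_RDWR | O_CLOEXEC);
    if (fd < 0)
        return -1;
    if (drmSetClientCap(fd, DRM_CLIENT_CAP_ATOMIC, 1) ||
        drmSetClientCap(fd, DRM_CLIENT_CAP_WRITEBACK_CONNECTORS, 1)) {
        close(fd);
        return -1;
    }
    return fd;
}
As far as I understand, the commit also needs the targeted CRTC to be active (ACTIVE and MODE_ID set in the same or an earlier commit), which is another place an EINVAL could come from.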

How to use multi-core for the MPC5748G?

I'm trying to implement the LZMA algorithm (a compression/decompression algorithm) on the MPC5748G, but I need an example of how to use more than one core, since the device has two 160 MHz cores.
I'm using LZMA to reduce flashing time: the file is first compressed and then sent to the MPC, which should then decompress it and perform the flashing operation.
The algorithm needs to run on a separate core, because the other core is doing other things and already contains a lot of tasks. The results so far are not very good and the decompression takes too much time.
As #marcus commented: the problem is not writing an LZMA decoder, but running it on a different core.
Any help with using the other core would be very welcome.
How about Core_Boot(void)?
/*******************************************************************************
Function Name : Core_Boot
Engineer      : Lukas Zadrapa
Date          : Apr-20-2016
Parameters    : NONE
Modifies      : NONE
Returns       : NONE
Notes         : Start e200z4b and e200z2 cores
Issues        : NONE
*******************************************************************************/
void Core_Boot(void)
{
    /* Enable e200z4b and e200z2 cores in RUN0-RUN3, DRUN and SAFE modes */
    MC_ME.CCTL[2].R = 0x00FC;                    /* e200z4b is active */
    MC_ME.CCTL[3].R = 0x00FC;                    /* e200z2 is active */

    /* Set start address for e200z4b and e200z2 cores */
    MC_ME.CADDR[2].R = E200Z4B_BOOT_ADDRESS | 1; /* e200z4b boot address + RMC bit */
    MC_ME.CADDR[3].R = E200Z2_BOOT_ADDRESS | 1;  /* e200z2 boot address + RMC bit */

    /* Mode change - re-enter the DRUN mode to start the cores */
    MC_ME.MCTL.R = 0x30005AF0;                   /* mode & key */
    MC_ME.MCTL.R = 0x3000A50F;                   /* mode & key inverted */

    while (MC_ME.GS.B.S_MTRANS == 1);            /* wait for mode transition to complete */
    while (MC_ME.GS.B.S_CURRENT_MODE != 0x3);    /* check that DRUN mode was entered */
} /* Core_Boot */
Do you need to exchange data between cores? Regards
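If data exchange is needed, a plain shared-memory mailbox is often enough to hand compressed blocks to the second core. Below is a minimal sketch, assuming GCC-style section attributes and a '.shared' section that your linker file maps to RAM visible to both cores (the names are mine, not from any SDK):
#include <stdint.h>

typedef struct {
    volatile uint32_t ready;   /* producer sets this to 1 when data is valid */
    volatile uint32_t length;  /* number of compressed bytes in data[] */
    uint8_t data[4096];        /* compressed payload */
} mailbox_t;

/* Must live in RAM visible to both cores (placed via the linker file) */
static mailbox_t __attribute__((section(".shared"))) mbox;

/* Boot core: hand one compressed block to the decompressor core */
void submit_block(const uint8_t *src, uint32_t len)
{
    uint32_t i;
    while (mbox.ready) { }     /* wait until the previous block was consumed */
    for (i = 0; i < len; i++)
        mbox.data[i] = src[i];
    mbox.length = len;
    mbox.ready = 1;            /* publish; the other core polls this flag */
}
The consumer on e200z4b polls 'ready', decompresses, then clears the flag. For anything beyond a sketch you would want proper barriers, or the chip's hardware semaphores, instead of plain polling.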

ATmega8 UART - doesn't show character in RealTerm

Hi, I'm new to this and I need help. The code is supposed to just show an 'S' in RealTerm, but instead it shows 'null'. What could the problem be? Could it be a register setting, or the code itself?
#include <avr/io.h>
#include <util/delay.h>

void UART_Init(unsigned int ubrr)
{
    UBRRH = (unsigned int)(ubrr >> 8);
    UBRRL = (unsigned int)ubrr;
    UCSRA = 0x00;
    UCSRB = (1 << TXEN) | (1 << RXEN);
    UCSRC = (0 << USBS) | (1 << UCSZ0) | (1 << UCSZ1);
}

void UART_Tx(unsigned char chr)
{
    while (bit_is_clear(UCSRA, UDRE)) {}
    UDR = chr;
}

int main(void)
{
    UART_Init(95);
    DDRD |= 0B11111111;
    PORTD |= 0B11111111;
    while (1) {
        _delay_ms(10);
        UART_Tx('S');
    }
}
The system is running on a 14745600 Hz crystal. The speed on the host is 9600 baud. All settings should be 8N1.
You need to set the URSEL bit when writing to the UCSRC register.
Change
UCSRC=(0<<USBS)|(1<<UCSZ0)|(1<<UCSZ1);
to
UCSRC=(1<<URSEL)|(0<<USBS)|(1<<UCSZ0)|(1<<UCSZ1);
From the data sheet:

    The UBRRH Register shares the same I/O location as the UCSRC Register.
    Therefore some special consideration must be taken when accessing this
    I/O location. When doing a write access of this I/O location, the high
    bit of the value written, the USART Register Select (URSEL) bit, controls
    which one of the two registers that will be written. If URSEL is zero
    during a write operation, the UBRRH value will be updated. If URSEL is
    one, the UCSRC setting will be updated.
The rest of the code looks fine to me.
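As a side note, the same register sharing means that reading UCSRC back is also special: per the data sheet, the read must be two back-to-back accesses of the shared I/O location. A sketch adapted from the data sheet (interrupts are assumed to be disabled around the call):
#include <avr/io.h>

/* The first read of the shared location returns UBRRH; a read in the
 * very next cycle returns UCSRC. The sequence is timing-critical. */
unsigned char USART_ReadUCSRC(void)
{
    unsigned char ucsrc;
    ucsrc = UBRRH;
    ucsrc = UCSRC;
    return ucsrc;
}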
Change UART_Tx('S'); to UART_Tx("S");

PIC16F877 + 24LC64 via i2c

My task is to copy the first 255 bytes from an external EEPROM (24LC64) to the internal EEPROM of a PIC16F877 via the I2C bus. I've read AN1488, all the datasheets, and the MikroC guide (oh yes, I'm using MikroC), but I'm stuck. My code tries to read something, but when I then read back my PIC's EEPROM in the programmer (which can't read the 24LC64, so I don't even know what's on it - there is definitely something, and it is different from what I'm getting), the whole EEPROM is filled with "A2" or "A3". My guess is that this is the first address byte with which I address the 24LC64. Could you please inspect my code (it's quite small) and point out my mistakes?
char i;
unsigned short Data;

void main() {
    PORTB = 0;
    TRISB = 0;
    I2C1_Init(100000);
    PORTB = 0b00000010;

    for (i = 0x00; i < 0xFF; i++) {
        I2C1_Start();
        I2C1_Wr(0xA2);         /* control byte: 1010 001 0 (write) */
        /* I'm getting the full internal EEPROM filled with the value above */
        I2C1_Wr(0b00000000);   /* word address, high byte */
        I2C1_Wr(i);            /* word address, low byte */
        I2C1_Repeated_Start();
        I2C1_Wr(0xA3);         /* control byte: 1010 001 1 (read) */
        Data = I2C1_Rd(0);     /* read one byte, reply with NACK */
        I2C1_Stop();

        EEPROM_write(i, Data); /* How could that 1010 001 0 get in here??? */
        Delay_100ms();
    }

    PORTB = 0b00000000;
    while (1) {
    }
}
P.S. I've tried this with a sequential read, but it "reads" (again that "A2"...) only the first byte, so I've posted this version.
P.P.S. I'm working on real hardware; no Proteus involved.
P.P.P.S. I can't test writing, because I have only one 24LC64 and it holds important info, so its WP pin is even pulled up to Vcc.
This isn't a specific answer but more of a checklist for I2C comms, since it's difficult to help with your problem without looking at a scope and without delving into the API calls that you've used.
1. Check the address of your EEPROM. I2C uses a 7-bit address with an R/W bit appended at the end, so it's easy to make a mistake here.
2. Check the command sequence that your EEPROM expects to receive for a "data read".
3. Check how the I2C_ API that you're using deals with ACKs from the EEPROM. They need to be handled somewhere (usually in an ISR), and it's not obvious from your example where they're dealt with - see the sketch after this list.
4. Check that you've got the correct pull-ups on SDA and SCL as per the requirements of your design - they're needed for I2C to work.
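Putting the first three checks together, here is a sketch of one random read with the ACK of the control byte actually checked. I'm assuming that I2C1_Wr() returns 0 on ACK (check your MikroC version's help) and that the chip's A2..A0 pins are grounded, giving control bytes 0xA0/0xA1:
unsigned short ee_ack;  /* 1 if the EEPROM ACKed the control byte */

unsigned short Read_24LC64(unsigned short addr_hi, unsigned short addr_lo) {
    unsigned short data;

    I2C1_Start();
    ee_ack = (I2C1_Wr(0xA0) == 0);  /* control byte 1010 000 0, write */
    I2C1_Wr(addr_hi);               /* word address, high byte */
    I2C1_Wr(addr_lo);               /* word address, low byte */
    I2C1_Repeated_Start();
    I2C1_Wr(0xA1);                  /* control byte 1010 000 1, read */
    data = I2C1_Rd(0);              /* read one byte, reply with NACK */
    I2C1_Stop();
    return data;
}
If ee_ack stays 0, the device never answered, which would point at the control byte (your 0xA2/0xA3 assume the A0 pin is tied high) or at missing pull-ups.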

Configure Linux I2C Speed

I am using I2C on the Snowball board, running at 400 kHz by default, and would like to reduce this to 100 kHz.
I use the API defined in <linux/i2c-dev.h> and configure it as follows:
m_fd = open(m_filename.c_str(), O_RDWR);
if (ioctl(m_fd, I2C_SLAVE_FORCE, m_addr) < 0)
{
    throw I2cError(DeviceConfigFail);
}
Does anyone know how I would go about changing the speed to standard mode (100 kHz)?
Thanks
You can change the I2C SCL frequency in your driver's 'struct i2c_gpio_platform_data'.
static struct i2c_gpio_platform_data xyz_i2c_gpio_data = {
    .sda_pin = GPIO_XYZ_SDA,
    .scl_pin = GPIO_XYZ_SCL,
    .udelay  = 5,   /* signal toggle delay; SCL frequency is (500 / udelay) kHz */
    ....
};
Changing 'udelay' changes your 'xyz' i2c device's clock frequency.
You should change the I2C frequency in the driver source file of the corresponding peripheral (i.e. the slave device you are communicating with over I2C, e.g. an EEPROM or a camera).
You may find a macro defined in that driver source code, like:
#define EEPROM_I2C_FREQ 400000 //400 kHz
Change it to:
#define EEPROM_I2C_FREQ 100000 //100 kHz
The I2C frequency/speed will then be changed only for that corresponding driver.
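Note for the Snowball specifically: both answers above assume you can rebuild the kernel, since the ux500's I2C buses are driven by the i2c-nomadik controller and their speed comes from platform data, not from a userspace ioctl. A sketch of what I believe that platform data looks like (struct and field names recalled from platform_data/i2c-nomadik.h; treat them as assumptions and verify against your kernel tree):
#include <linux/platform_data/i2c-nomadik.h>

/* Assumed layout - check platform_data/i2c-nomadik.h in your tree */
static struct nmk_i2c_controller snowball_i2c0_data = {
    .clk_freq = 100000,                 /* bus clock in Hz: 100 kHz standard mode */
    .sm       = I2C_FREQ_MODE_STANDARD, /* frequency mode matching clk_freq */
};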
