CCS C Software and Maintenance Offers

CCS does not monitor this forum on a regular basis.

Please do not post bug reports on this forum. Send them to support@ccsinfo.com

5.116 I2C issues
newguy



Joined: 24 Jun 2004
Posts: 1902

Posted: Sun Oct 15, 2023 11:48 am

PIC18F24Q10.

With the 'symbolic' list file option on:

Code:
....................    i2c_start(EXPANDER);
*
05F6:  BSF    SSP1CON2.SEN1
05F8:  BTFSC  SSP1CON2.SEN1
05FA:  BRA    05F8

....................    i2c_stop(EXPANDER);
0618:  BSF    SSP1CON2.ADMSK2
061A:  BTFSC  SSP1CON2.ADMSK2
061C:  BRA    061A


The i2c_start() disassembly is correct. The i2c_stop() disassembly references a bit that doesn't exist within SSP1CON2. If I set the project option to 'CCS Basic' and recompile:

Code:
....................    i2c_stop(EXPANDER);
0618:  BSF    F96.2
061A:  BTFSC  F96.2
061C:  BRA    061A


This demonstrates that, despite the naming slip, the i2c_stop() function is actually setting the correct bit within SSP1CON2.

Further, there's an issue with how the compiler sets the baud rate within the #use spi functionality:

Code:
#use spi(MASTER, BAUD=3200000, MODE=0, SPI1, MSB_FIRST, STREAM=FRAM, NOINIT)


This is an easily 'hit' perfect divisor: with Fosc = 64MHz, SSP1ADD should be set to 5 - 1 = 4, giving an SPI clock of 64MHz / (4 x 5) = 3.2MHz precisely. Instead, CCS sets SSP1ADD to 3, which yields an SPI clock of 4MHz. I've played with the BAUD values in the #use spi line, and the compiler keeps setting SSP1ADD to 3 for every baud rate I've tried between 4MHz and 2.8MHz. As soon as I try 2.7MHz, SSP1ADD gets set to 5, which gives an SPI clock of 64MHz / (4 x 6) = 2.667MHz. For whatever reason, the internal algorithm skips right over the desired baud rate that corresponds to an SSP1ADD setting of 4.
newguy



Joined: 24 Jun 2004
Posts: 1902

Posted: Sun Oct 15, 2023 1:39 pm

Another thing I just uncovered....

My project has to rapidly switch between I2C and SPI (at a 1kHz rate) and the processor only has one MSSP. I was hitting a weird race condition where the I2C transaction did actually complete (verified with a logic analyzer), but for some reason the MSSP wasn't changing to SPI master mode - not completely, anyway. Any attempt to use the MSSP in SPI mode in this state would result in a locked processor. The code would hang in spi_transfer_write().

My switchover from I2C to SPI and back entails PPS changes in addition to killing the MSSP and then reconfiguring it in the proper mode (either SPI or I2C). I could see that it would very consistently hang just after the completion of an I2C transaction. Given that I2C is slower than SPI (for my case, anyway), and there's no way that I could see to test whether the I2C transaction had actually completed, I tried adding a delay before doing the changeover to SPI. That never worked. In desperation I tried something of which I'm not proud - invoking the SPI initialization twice. That worked. Now the stupid thing no longer hangs.

Initially I used the CCS built-in function spi_init(), and it exhibited this hanging behavior. I then 'rolled my own' SPI init routine, and the behavior persisted. I've rolled back to the CCS spi_init(), and if I leave *one* instance of it, the SPI still exhibits this lockup. If I include *two*, one after the other, then the problem goes away.
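For the record, the shape of the workaround as a CCS C sketch; the stream name is from my project and the PPS/setup details are omitted, so treat this as an illustration rather than a reference implementation:

Code:
// I2C -> SPI changeover, per the workaround described above.
// FRAM is the #use spi stream name from earlier; PPS changes and
// MSSP shutdown (project specific) would go before this.
void switch_to_spi(void)
{
   spi_init(FRAM, 3200000);   // first init: sometimes doesn't 'take'
   spi_init(FRAM, 3200000);   // second init: with this, no more hangs
}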

Do you think this may be some sort of undocumented errata with the MSSP peripheral of the PIC18F24Q10?
temtronic



Joined: 01 Jul 2010
Posts: 9113
Location: Greensville,Ontario

Posted: Sun Oct 15, 2023 2:50 pm

Man, I feel sorry for the PIC: I'm I2C, no, SPI, no, I2C, no, SPI, arrrgh.
Poor thing is torn between the two; it's easy for a bit to not be in the right place. Sigh.


Yes, I've done the 'double init' before.... WITH a delay in between!
OK, I have to ask...
WHY didn't you choose a PIC with separate I2C and SPI ports???
newguy



Joined: 24 Jun 2004
Posts: 1902

Posted: Sun Oct 15, 2023 3:28 pm

Parts availability. Initially I only required SPI, but a HW revision left me needing pins the processor didn't have, so I reluctantly added the I2C IO expander. Space on the PCB is very, very tight and I cannot move up to a larger PIC with more pins.

So far it's been about 2h with the update and no hangs yet. I think I have it licked.
temtronic



Joined: 01 Jul 2010
Posts: 9113
Location: Greensville,Ontario

Posted: Sun Oct 15, 2023 4:39 pm

'grow-itis'... sigh.... BTDT...
Hopefully your code issues are SOLVED......
Ttelmah



Joined: 11 Mar 2010
Posts: 19225

Posted: Mon Oct 16, 2023 1:55 am

OK.
The name thing should simply be reported to CCS. It is setting the right
bit, but for some reason the name database has got scrambled on this chip.
ADMSK2 is used as the slave address mask select bit on some of the
chips (2550 etc.). It is mask 2, so it's the right bit, as you have found.
It really should be SSPM2.
Report the SPI clock selection issue as well.

I must admit you are probably the first person to be trying to switch the
MSSP from SPI to I2C 'on the fly', so finding an undocumented issue
is perhaps not surprising. I suggest you report this to MicroChip.

Why do the MSSP switch at all? It seems pointless. Why not just use
software I2C? You can simply set this up on the same pins, then use a
pin_select 'none' to turn off the SPI selection and talk to the software
I2C. My guess would be that the change from the open-collector setup to
bi-directional drive is resulting in something like a spurious clock
pulse on the SPI. A second init clears the register.
newguy



Joined: 24 Jun 2004
Posts: 1902

Posted: Mon Oct 16, 2023 9:52 am

CCS alerted to the minor issues and Microchip support ticket generated.

I shied away from using software I2C for a bunch of reasons, but the primary one is that the project is battery-operated; I put the processor into doze mode every chance I get to save power. Bit-banging I2C at a high rate means a lot of clock cycles and a shorter battery lifetime.
jeremiah



Joined: 20 Jul 2010
Posts: 1317

Posted: Mon Oct 16, 2023 1:02 pm

I've used software I2C in multiple low-power projects (avg current draw < 1uA)... mostly for EEPROMs and peripheral chips. I haven't seen a noticeable power increase over hardware implementations. Generally, any time spent in the SW routines has been mimicked by the hardware routines, because the hardware has to wait, polling the signal lines to detect ACK/NACK and release of the lines.

Have you benchmarked it both ways in your code? I realize every software program is different, but I was curious.
newguy



Joined: 24 Jun 2004
Posts: 1902

Posted: Mon Oct 16, 2023 1:28 pm

For this project, no, but I have benchmarked HW-based I2C vs bit-banged I2C for a prior project. That one was a LoRa device which interfaced to an I2C sensor. It was asleep the majority of the time (full sleep, no clock running at all).

I benchmarked the final version with a pair of fresh AA alkaline batteries, and it managed to transmit 'several hundreds of thousands' of times. I'm using quotes because I don't remember the exact number, but I want to say it was 300,000+. The test ran several months (something like 8 months) and the device was set to transmit once per minute. This was using the HW I2C. I did try benchmarking an identical unit (hardware-wise) with bit-banged I2C (CCS' soft implementation of I2C) and the batteries ran down significantly faster. I cannot recall the exact details, but I know that the batteries on that one died at something like 1/3rd of the transmissions I was able to achieve using the HW I2C.
Ttelmah



Joined: 11 Mar 2010
Posts: 19225

Posted: Tue Oct 17, 2023 12:52 am

The software I2C, if running at the same rate, draws no more power than
the hardware I2C. Like jeremiah, I've run both. What matters is the time
the chip is awake. If you are running hardware, the chip is still awake,
running clock cycles waiting for the hardware. If software, it's the same
clock cycles doing the I/O. In fact, if you put the peripheral to sleep and
run software I2C, the consumption actually drops a little.....
The extra peripheral draws power.
newguy



Joined: 24 Jun 2004
Posts: 1902

Posted: Tue Oct 17, 2023 5:44 am

Understood, but what I've always done for ultra-low-power situations is to not 'block' waiting for an I2C transaction to complete. Enable the appropriate interrupt, start the transaction, and then doze. When the transaction completes, the processor starts getting clocked again, and you can initiate the next transaction in the same manner. When you're all done, use the PMD to kill the peripheral entirely.
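A conceptual sketch of that pattern in CCS C; it pokes SSP1CON2 directly because the built-in i2c_start() busy-waits, and the flag, function name, and idle mechanism are mine, so check your device's doze support before copying anything:

Code:
#byte SSP1CON2 = getenv("SFR:SSP1CON2")

static int1 mssp_done = FALSE;

#INT_SSP
void mssp_isr(void)
{
   mssp_done = TRUE;              // end of the current I2C stage
}

void i2c_start_dozing(void)
{
   mssp_done = FALSE;
   clear_interrupt(INT_SSP);
   enable_interrupts(INT_SSP);
   bit_set(SSP1CON2, 0);          // SEN: launch the START condition
   while (!mssp_done)
      sleep(SLEEP_IDLE);          // idle/doze; the SSP interrupt wakes us
}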

^ If using software i2c, that extra bit of time that the processor is up & fully running, over the lifetime of a battery, adds up to quite a bit...when I last benchmarked it anyway. I luckily had a very competent 'ultra low power' mentor some time back; every clock cycle was carefully considered and accounted for. He developed a particular product that was powered from a coin cell (2032) and the lifetime of the product almost equaled the self discharge of the battery. I know he had one customer whose product lasted ~15 years, but I think that particular one was powered from a pair of 2032s IIRC. He even 'rolled his own' clock as what he developed drew a fraction of the current that the processor's internal oscillator drew.
Ttelmah



Joined: 11 Mar 2010
Posts: 19225

Posted: Tue Oct 17, 2023 7:13 am

OK. That makes sense. However, that you are seeing much of a power
difference implies your chip must be waking rather a lot. Normally the
difference between doze and doing it directly with the CPU is not enough
to reduce battery life noticeably. Doze is also one of the things that
commonly has a lot of errata on many PICs, so it is worth avoiding if you
can. Are you using doze mode when you have the switching problems with the
MSSP/I2C? If so, it'd be worth trying without it and seeing if the
behaviour changes. Most of the chips that support doze mode have not been
around long enough to really test battery life.

I have to ask how you handle the I2C pull-ups. If these are enabled when
you are running the bus as SPI, this will waste significant power.
newguy



Joined: 24 Jun 2004
Posts: 1902

Posted: Tue Oct 17, 2023 8:51 am

Haven't implemented doze yet - everything is running full out for dev; power savings will be one of the last things I implement. I have used doze quite extensively in the past with great success, but on a different set of processors from this present one.

The PCB has two line groups: one for SPI and the other for I2C. Separate pins on the PIC. I didn't want to 'dual purpose' lines that ran to both my SPI accelerometer and the IO expander. I also wanted to ensure that layout was as close to 'ideal' as possible to ensure good signal integrity for the SPI in particular. This product has to work no matter what.

The more I think about it, using software I2C would have been the easiest way out. I always default to trying to use peripherals no matter what and that set of 'blinders' is what has guided this progression. At any rate, I did stumble upon a fix; I'm now awaiting Microchip's word regarding why configuring the MSSP as an SPI master when it had been an I2C master prior (and then disabled entirely) doesn't always 'take' the first time. Interestingly, configuring the MSSP as an I2C master always works the first time. My experiment has been running continuously for almost 48h without a hiccup since I stumbled upon the 'fix'.
jeremiah



Joined: 20 Jul 2010
Posts: 1317

Posted: Tue Oct 17, 2023 9:08 am

What you could be seeing is that, when switching to the new mode, the old mode is still finishing up its last request(s), so the peripheral is getting confused.

Out of curiosity, did you try polling the I2C registers to make sure it was completely finished transmitting/reading before initializing the SPI registers? There should be some status bits that tell you if a transaction is in progress. It may or may not help, but it's something to at least play with. I think someone mentioned a delay in between (which is a quick and dirty way to approximate the polling, but also a valid thing to test).
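For what it's worth, on the classic MSSP the usual 'bus idle' test looks like this; the register/bit layout is assumed from older PIC18 MSSPs and may not match every device:

Code:
#byte SSP1CON2 = getenv("SFR:SSP1CON2")
#byte SSP1STAT = getenv("SFR:SSP1STAT")

// TRUE while a START/RESTART/STOP/RECEIVE/ACK sequence (low 5 bits
// of SSPxCON2) or a transmit (SSPxSTAT R/W bit) is still in progress.
int1 i2c_busy(void)
{
   return ((SSP1CON2 & 0x1F) != 0) || bit_test(SSP1STAT, 2);
}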
newguy



Joined: 24 Jun 2004
Posts: 1902

Posted: Tue Oct 17, 2023 9:16 am

Good idea, but on this processor at least, there's no status bit anywhere to let you know that a transaction is still 'in progress.' I looked - believe me.

I did put in delays before doing the switch to SPI if I2C mode was active, but that had no positive effect. Ridiculously long delays - on the order of 200us - didn't make the problem go away.
Page 1 of 2