Questions about interrupts.

 
GDetienne
Joined: 20 Sep 2003
Posts: 47
Location: Brussels, Belgium

Posted: Wed Feb 11, 2004 8:52 am

I use a 16F876 with a 16 MHz crystal.
My program uses three interrupts: B0 (external), Timer0 and RS232 (or I2C).
In the interrupt routines for B0 (one interrupt every second) and Timer0 (one interrupt every millisecond), I increment a counter and set a flag. The rest of the code is in the main program.
For RS232, I read the first character and set a flag; in the main program I disable this interrupt and read the remaining characters. For I2C, I use the interrupt to read all the data.
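A simplified sketch of that structure (illustrative values only; this is not the exact code):

Code:
#include <16F876.h>
#fuses HS, NOWDT, NOLVP
#use delay(clock=16000000)
#use rs232(baud=9600, xmit=PIN_C6, rcv=PIN_C7, errors)

int8  ms_count  = 0;      // incremented by the Timer0 ISR
int16 sec_count = 0;      // incremented by the B0 ISR
int1  ms_flag   = 0;
int1  sec_flag  = 0;
int1  rda_flag  = 0;
char  first_char;         // first received character, handled in main()

#int_timer0               // roughly 1 ms tick with the prescaler below
void timer0_isr(void)
{
   ms_count++;
   ms_flag = 1;
}

#int_ext                  // external pulse on B0, once per second
void b0_isr(void)
{
   sec_count++;
   sec_flag = 1;
}

#int_rda                  // first received character only
void rda_isr(void)
{
   first_char = getc();
   rda_flag = 1;
}

void main(void)
{
   setup_timer_0(RTCC_INTERNAL | RTCC_DIV_16);  // 256*16/4 MHz = ~1.02 ms
   enable_interrupts(INT_TIMER0);
   enable_interrupts(INT_EXT);
   enable_interrupts(INT_RDA);
   enable_interrupts(GLOBAL);

   while (TRUE)
   {
      if (rda_flag)
      {
         disable_interrupts(INT_RDA);   // read the rest of the message here
         // ... kbhit()/getc() loop for the remaining characters ...
         rda_flag = 0;
         enable_interrupts(INT_RDA);
      }
      if (ms_flag)  { ms_flag = 0;  /* 1 ms work */ }
      if (sec_flag) { sec_flag = 0; /* 1 s work  */ }
   }
}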

I have some questions:

1) What happens if an interrupt occurs while the program is already running in another interrupt routine? Is the second interrupt handled after the first one? I think that when an interrupt handler starts, that interrupt is disabled, but what about the global interrupt?

2) What happens if I use I2C (in which case the program reads each character under interrupt) and another interrupt arrives? If a second interrupt arrives, does the program use the priority? Then I go to another routine for a short time, but what happens to the I2C transmission?

3) How can I calculate the time needed to send one (or more) characters over RS232 (at 9600 baud, for example)? I use the hardware UART on this PIC. I think the RDA interrupt is not disabled while it is sending.

Thanks for any further information on these general problems.
Ttelmah
Guest

Re: Questions about interrupts.
Posted: Wed Feb 11, 2004 10:09 am


The first answer would be to read the chip's data sheet...
However, I will give you some general comments.
When an interrupt triggers, the corresponding interrupt flag is set. If the global interrupt enable is set, and the interrupt enable for this event is set, the global interrupt handler will be called. At this point the _global_ interrupt enable is automatically cleared. The handler then saves the variables and registers that need to be preserved, and then starts checking the interrupt flags to see which one is set. The order in which this is done is controlled by the 'priority' setting. The first one that is found set, and which also has its enable set, then calls the corresponding handler. When the handler returns, the corresponding interrupt flag is cleared, the variables and registers are restored, and the global interrupt routine returns to the calling point, _setting the global interrupt enable as it does so_.
At this point, if another interrupt flag is set, the sequence will repeat.
So the 'priority' setting affects only the order in which the flags are checked in the handler. An interrupt cannot interrupt a handler.
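In CCS C that polling order is set with the #priority directive. A small illustration (assuming the usual #int_xxxx style names are accepted for the sources used in this thread):

Code:
// Test the RS232 receive flag first, then the Timer0 tick, then B0.
// This only changes the order the flags are checked in the generated
// dispatcher; it does not let one handler pre-empt another.
#priority rda, rtcc, ext        // rtcc is the Timer0/RTCC interrupt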
On later chips (the 18 family), there is a separate 'hardware' priority ability, which does allow one interrupt to interrupt another handler.
The time from the interrupt 'event', to actually arriving in the corresponding handler, will typically be perhaps 80 to 100 clock cycles. Similarly, when the handler finishes, there will be perhaps another 70 to 90 clock cycles needed for the 'tidying up' before the global handler exits.
The time taken for serial transmission is the start bit, plus the number of bits in the character, plus the parity bit (if used), plus the stop bit. Hence for '8 bit, no parity, one stop', the total time is 10 bit times. The bit time is 1/bps, so at 9600 bps the total time (for 8N1) is 10/9600 of a second.
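For question 3, the same arithmetic can go straight into code (an illustrative helper, not a compiler library function):

Code:
// Time, in microseconds, to transmit n characters on an asynchronous link:
// each frame is start + data + parity + stop bits.
// Example: 8N1 at 9600 bps -> 10 bits per frame -> about 1042 us per character.
#define DATA_BITS   8
#define PARITY_BITS 0        // 1 if parity is enabled
#define STOP_BITS   1
#define BAUD        9600

int32 tx_time_us(int16 n_chars)
{
   int32 bits_per_frame = 1 + DATA_BITS + PARITY_BITS + STOP_BITS;
   return (n_chars * bits_per_frame * (int32)1000000) / BAUD;
}
// tx_time_us(1) == 1041 us, tx_time_us(10) == 10416 us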
You really do need to read the data sheet for the chip to understand how the hardware works, and look at many of the application notes (from Microchip) and the examples (from CCS), to start to understand both the abilities and limitations of these chips.

Best Wishes
GDetienne
Joined: 20 Sep 2003
Posts: 47
Location: Brussels, Belgium

Posted: Thu Feb 12, 2004 2:00 pm

Many thanks for your answer and ...... your suggestion.

Regards
willie.ar
Joined: 21 Jan 2004
Posts: 15
Location: Argentina

Re: Questions about interrupts.
Posted: Mon Mar 08, 2004 9:06 am

Ttelmah, you gave an excellent description of how interrupts work, but I cannot understand why you say "The time from the interrupt 'event', to actually arriving in the corresponding handler, will typically be perhaps 80 to 100 clock cycles". I'm experiencing something like that and I'm wondering why. I need exact delays.

Can you explain a little bit more about that?
Thanks.


Guest

Re: Questions about interrupts.
Posted: Mon Mar 08, 2004 9:49 am

willie.ar wrote:
"The time from the interrupt 'event', to actually arriving in the corresponding handler, will typically be perhaps 80 to 100 clock cycles." Can you explain a little bit more about that? I need exact delays.


'Exact' figures will be _very_ hard.
The interrupt itself has a 'latency', which is the time from the event to the interrupt flag being set. On an asynchronous source, this time will vary according to the relationship between the event and the processor's clock; hence the difficulty in giving 'exact' figures. The delay at this point will be the time to the next Q1 state of the processor.
Once the interrupt flag is set, assuming that interrupts are enabled at this point, the processor will jump to the global interrupt handler routine. Inside the routine, the code then saves the necessary registers. The number of registers involved depends on the chip, but is typically perhaps fifteen registers on the 18 chips, and about ten on the 16 chips. The 18 chips usually have about 27 instructions in the global handler to this point.
Then the code starts to check each interrupt source in turn, to see whether it is enabled and whether it has triggered. This takes another four instruction times for each source.
If the interrupt is found to be both enabled and triggered, the code then jumps to the handler. At this point (for the first interrupt), there will have been about 33 instruction times since the actual 'trigger', which is 132 clock cycles. The 16 chips save slightly fewer registers and have a little less latency to this point; on these, the figure is more like 27 instruction times for the first interrupt (108 clock cycles).
Then, when the interrupt handler is finished, there is a return to the global handler, which restores the same registers before returning to the caller, re-enabling the global interrupt at the same time.
The only way to get an 'accurate' time will be to sit down and count the instructions in the interrupt handler for your code, and add the hardware response delay from the data sheet as well.
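One way to see the real figure for a particular build (a sketch, assuming Timer0 is free to use for the test) is to read Timer0 itself at the top of its own handler, since the counter passes zero at the moment its interrupt flag is set:

Code:
// Illustrative latency check: Timer0 rolls over from 0xFF to 0x00 at the
// instant the interrupt flag is set, so the value read at the top of the
// ISR is roughly the number of instruction cycles (1 instruction = 4 clock
// cycles) spent getting there.
#include <16F876.h>
#fuses HS, NOWDT, NOLVP
#use delay(clock=16000000)

int8 latency_icycles;

#int_timer0
void timer0_isr(void)
{
   latency_icycles = get_timer0();            // instruction cycles since overflow
}

void main(void)
{
   setup_timer_0(RTCC_INTERNAL | RTCC_DIV_1); // 1 tick = 1 instruction cycle
   enable_interrupts(INT_TIMER0);
   enable_interrupts(GLOBAL);
   while (TRUE) ;                             // watch latency_icycles in a debugger
}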

1) Individual interrupts are _not_ disabled in the interrupt handler. It is the global interrupt that gets disabled as soon as the chip responds to an interrupt, and re-enabled when the global handler is left. In the meantime, if an interrupt has occurred, its interrupt flag will be set, so when interrupts are re-enabled, the global handler will be called again on the next Q1 edge.
When the global handler is called, the interrupt flags themselves are 'polled' in the handler, in the order of the #priority setting. Hence if there are repeated interrupts at a higher priority, a low-priority handler will never get called.
2) This comes down to dealing with the latency and relying on buffering. The 'worst case' is when the global handler has just been called for a higher-priority event and a second interrupt takes place. If this happens, the processor goes through the entire sequence of saving registers, calling the higher-priority handler, running its code, restoring the registers, exiting back to the main code, calling the global handler again, saving the registers again, and then calling the handler for the second event. Expect delays in the order of perhaps 300 to 400 clock cycles in this situation. However, when you consider that at (say) 20 MHz a byte on the I2C bus at 100 kHz takes perhaps 1/12000th of a second to send, and the delay for the handler in this worst case is about the same, the chip's buffering allows this to be acceptable, _provided all interrupt handlers are kept as short as possible_ (hence this mantra...).
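As an illustration of the 'keep handlers short' mantra (a common pattern, not code from this thread), the RS232 receive handler can simply queue each byte and leave all of the processing to main():

Code:
// Assumes the usual #include <16F876.h> / #use delay / #use rs232(...) setup.
#define RX_BUF_SIZE 32                 // power of two keeps the wrap cheap
char rx_buf[RX_BUF_SIZE];
int8 rx_head = 0;
int8 rx_tail = 0;

#int_rda
void rda_isr(void)                     // only a handful of instructions long
{
   rx_buf[rx_head] = getc();           // reading the byte clears the RDA condition
   rx_head = (rx_head + 1) & (RX_BUF_SIZE - 1);
}

// Called from main(): returns TRUE and fills *c when a byte is waiting.
int1 rx_get(char *c)
{
   if (rx_head == rx_tail)
      return FALSE;
   *c = rx_buf[rx_tail];
   rx_tail = (rx_tail + 1) & (RX_BUF_SIZE - 1);
   return TRUE;
}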
3) I already answered this above.
The length of a character (in bits) is the start bit, plus the word length, plus the parity bit, plus the stop bit. So for '8 bit, no parity, 1 stop', the character is 10 bit times long. 9600 bps implies each bit takes 1/9600th of a second, hence a character takes 10/9600th of a second.

Best Wishes
Guest

Re: Questions about interrupts.
Posted: Wed Mar 10, 2004 6:34 am

Thanks a lot for your response.
Willie.ar