I used the C# Timer, but its minimum interval is 1 ms.
I need a timer with microsecond resolution.

I doubt you need a microsecond timer. That barely gives whatever logic you call enough time to initialize. What are you trying to do?

First, thank you for the reply.

I want to send data through the parallel port every 10 microseconds.

Well... I guess you could have a valid reason for wanting to do it :P. However, it probably wasn't included because .NET is intended for application development and not so much for real-time operations like this. You can time events every 0.5 ms or however you want, but I think your next timer will frequently fire the event before the last call has finished. You will need to do a considerable amount of testing on this. Also, you will want to test without the debugger attached and with code optimizations on, because the added overhead may very well delay execution. Take a look at the Stopwatch class. I think that is the most accurate way of controlling timing included in the .NET Framework.
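A minimal test sketch along those lines, using nothing beyond the Stopwatch class; it simply measures how long a requested delay really takes (the 1 ms Sleep is just an example request):

    using System;
    using System.Diagnostics;
    using System.Threading;

    class TimingTest
    {
        static void Main()
        {
            // Stopwatch is the finest-grained clock in the framework; use it to
            // see what delay you actually get, rather than what you asked for.
            Console.WriteLine("High resolution: " + Stopwatch.IsHighResolution);
            Console.WriteLine("Nanoseconds per tick: " + (1e9 / Stopwatch.Frequency));

            Stopwatch sw = Stopwatch.StartNew();
            Thread.Sleep(1);                                  // ask for 1 ms
            sw.Stop();

            double micros = sw.ElapsedTicks * 1e6 / Stopwatch.Frequency;
            Console.WriteLine("Sleep(1) actually took " + micros + " microseconds");
        }
    }

Run it in a release build without the debugger attached, as suggested above; the measured figure is usually well above what was asked for.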

The Timer.Interval property is defined as a double, so if you want 10 microseconds you could try Timer.Interval = 0.01;
But as sknake pointed out, I also doubt whether it will work. Give it a try, and good luck!
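A quick sketch of that suggestion, assuming System.Timers.Timer (whose Interval really is a double, in milliseconds). Note that the Windows Forms Timer exposes Interval as an Int32 instead, which is where the error in the next post comes from:

    using System;
    using System.Timers;

    class TimerDemo
    {
        static void Main()
        {
            Timer timer = new Timer();
            timer.Interval = 0.01;          // 0.01 ms = 10 microseconds requested
            timer.Elapsed += delegate { Console.WriteLine("tick"); };
            timer.Start();

            System.Threading.Thread.Sleep(1000);
            timer.Stop();

            // In practice the effective resolution is still on the order of a
            // millisecond or worse, so don't expect ticks every 10 microseconds.
        }
    }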

I did it, but an error message appears saying "Property value is not valid (.01 is not a valid value for Int32.)"

How can I solve it?

Please, if you have an idea for solving the microsecond problem, tell me.

Thank you.

Sorry, I'm out of resources here...

I'm going to break protocol here somewhat, perhaps (since the question has already been begged), and ask what kind of scenario you are wanting to manipulate with a timer in microseconds? My teeny-macro mind cannot fathom what it could be...:icon_redface: Any real-world explanation would suffice for me if you cannot divulge the actual purpose.

I will use it as a clock signal to control hardware.

Could it be that you need something more like this? http://en.wikipedia.org/wiki/555_timer_IC
You could set this on and off via C#.

Now that makes sense to me, because I would think you would use a dedicated circuit/processor for that kind of speed. I had no idea those 555 timers are still so popular. I built some fun stuff with them way back when (a decibel meter, strobing LEDs, a binary clock). ;)

Hey! Way back I made a metronome with one! I was very proud when, after some soldering, it actually worked. :icon_biggrin:
Don't know if the 555 is still popular today...

It most certainly is. I last used one in an egg timer :p They are still the #1 timing IC around.

commented: Glad to know that! +6

I have a good knowledge of the 555 timer, but I need to make a pulse-width modulation signal using C#.

I suggest your best bet is to program something like a PIC microcontroller. Get it to accept a serial comms message that tells it frequency & mark space ratio. So your C# just sends a message to the dedicated hardware when it wants to change the modulation.

Fundamentally, 10 microseconds is far shorter than the granularity at which a scheduler can run tasks.

But do you need your PWM rate to be 100,000 Hz? If each pulse is to carry an independent value, you need to export a complete message to a microcontroller - and then you'll need to be double buffering to keep up with the required work rate. If the PWM stream represents a relatively slowly varying signal level, does the repeat rate need to be so high?

commented: the microsecond master has arrived. great first post! +6
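For what it's worth, the C# side of that scheme stays very simple. A sketch with a made-up message format and port settings (the PIC firmware would have to parse whatever protocol you choose):

    using System;
    using System.IO.Ports;

    class PwmController
    {
        // Hypothetical protocol: "F<hertz> D<percent>" per line, parsed by the
        // microcontroller, which then generates the PWM signal itself.
        static void SetPwm(SerialPort port, int frequencyHz, int dutyPercent)
        {
            port.WriteLine("F" + frequencyHz + " D" + dutyPercent);
        }

        static void Main()
        {
            using (SerialPort port = new SerialPort("COM3", 9600))   // assumed port and baud rate
            {
                port.Open();
                SetPwm(port, 100000, 25);   // 100 kHz carrier, 25% mark/space ratio
            }
        }
    }

The PC never touches the microsecond timing; it only sends a new setting when the modulation needs to change.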

You can ask for sub-millisecond delays in C# -> Thread.Sleep(new TimeSpan(10)); this TimeSpan ctor takes a long in 100-nanosecond intervals. - SB

Sorry, get the oscilloscope out. The fact that the syntax allows you to specify a 100 nsec interval doesn't prove you can deliver that to the outside world. You have to allow for scheduling overhead. In order to beat the other threads to a particular mill, you've got to set thread priority to make sure you're not the one that's being pre-empted.

The danger is you end up committing one of the processor cores to this objective, because you can't allow the overhead of a task context switch.

When a thread sleeps, it is de-scheduled, allowing the processor core that's running your thread to unload the working context of that thread, figure out which thread to run next, load its context, and switch to that. That's going to take more than a microsecond.
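For anyone who still wants to try, the usual workaround is to raise the thread's priority and spin instead of sleeping, so the thread is never de-scheduled. This is only a sketch; it pins a core at 100% and still offers no hard guarantee:

    using System.Diagnostics;
    using System.Threading;

    class SpinLoop
    {
        static void Main()
        {
            // Highest priority reduces (but does not eliminate) pre-emption.
            Thread.CurrentThread.Priority = ThreadPriority.Highest;

            long ticksPer10us = Stopwatch.Frequency / 100000;   // ~10 microseconds in Stopwatch ticks
            Stopwatch sw = Stopwatch.StartNew();
            long deadline = ticksPer10us;

            for (int i = 0; i < 1000; i++)
            {
                // Busy-wait until the next 10 us boundary instead of sleeping.
                while (sw.ElapsedTicks < deadline) { Thread.SpinWait(10); }
                deadline += ticksPer10us;

                // the time-critical output would go here
            }
        }
    }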

I understand what you want. I have the same issue here. I'm using a Measurement Computing PCI DIO-24 card with 24 I/O bits. It comes with a DLL and libraries for .NET. I've been using it to send data to some external FIFOs, but it is too slow with the timer at 1 ms. That means my clock has a period of 2 ms at best, which is only 0.5 kHz. My FIFOs can go to 100 MHz! It seems pretty stupid that a GHz computer can only time 1 ms intervals. --The Bug

Processors these days execute around four instructions for every tick of the crystal oscillator in the system clock, so if anything, it's not the processor. I would say it's a limitation of the .NET virtual machine sandboxing all the code it executes. If the OP seriously needs accuracy better than 1 ms, he either needs to switch to a native language or, more realistically, build a dedicated circuit into the device that uses a timing crystal of its own.

Just my 2 cents, best of luck with your project.
