>> This leads to a tradeoff between how many bits you use in the
>> hardware vs. how closely the real hardware filter matches your
>> desired response.

> My input bits width is 16
> coefficient bit width is 16
> output width is 40

Use the fixed_pkg VHDL package... it has all you need, to whatever
arbitrary precision you may require.

KJ
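A minimal sketch of what KJ is suggesting, assuming the IEEE fixed-point package (ieee.fixed_pkg in VHDL-2008, ieee_proposed.fixed_pkg on older tools); the entity name, index ranges and the single multiply-accumulate tap are illustrative, not taken from the posts:

library ieee;
use ieee.std_logic_1164.all;
use ieee.fixed_pkg.all;  -- ieee_proposed.fixed_pkg on pre-VHDL-2008 tools

entity mac_cell is
  port (
    clk  : in  std_logic;
    din  : in  sfixed(0 downto -15);   -- 16-bit signed input sample
    coef : in  sfixed(0 downto -15);   -- 16-bit signed coefficient
    acc  : out sfixed(8 downto -31)    -- 40-bit accumulator
  );
end entity;

architecture rtl of mac_cell is
  signal sum : sfixed(8 downto -31) := (others => '0');
begin
  process (clk) is
  begin
    if rising_edge(clk) then
      -- fixed_pkg sizes the full-precision product and sum automatically;
      -- resize() brings the result back to the accumulator width so
      -- nothing overflows silently.
      sum <= resize(sum + din * coef, sum'high, sum'low);
    end if;
  end process;
  acc <= sum;
end architecture;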
Article: 133151

Greetings All,

I've just spent 20 mins editing 12 VHDL files to add two signals and
route them up and down a design hierarchy.

Tedious and not exactly rocket science.

Is anyone aware of any refactoring tools out there to automate such
processes?

Regards,
Chris
Article: 133152

On Wed, 18 Jun 2008 07:03:58 -0700 (PDT), Kolja Sulimma
<ksulimma@googlemail.com> wrote:

>On 18 Jun., 14:48, Rob <BertyBoos...@googlemail.com> wrote:
>> This is exactly the problem, I'd rather not give certain registers
>> init values, especially in the data path, as firstly it is
>> unnecessary for the actual hardware implementation
>
>But you can't prevent it in a modern FPGA. Why should you explicitly
>build a model that does not reflect the actual behaviour of the
>hardware?
>
>> and secondly because in simulation seeing a vector of 'X's indicates
>> that this bus hasn't yet been assigned a proper value. The problem
>> comes when that data bus is driving a process that says something
>> like "if data_bus = 0 then...". I can usually avoid this with clock
>> enables, but sometimes it isn't convenient to have a clock enable
>> signal with the data.
>
>So you are replacing the free initialization of the registers with
>extra clock enable hardware for what reason exactly?

Explicit return to an initial state without the penalty of
reconfiguration. Hopefully the same initial state: always using reset
will ensure that; otherwise the post-config (signal initialization) and
post-reset states could be different...

(There may be cases where initialization alone is good enough)

- Brian
Article: 133153

On Jun 19, 7:32 am, christopher.saun...@durham.ac.uk (c d saunter)
wrote:
> Greetings All,
>
> I've just spent 20 mins editing 12 VHDL files to add two signals and
> route them up and down a design hierarchy.
>
> Tedious and not exactly rocket science.
>
> Is anyone aware of any refactoring tools out there to automate such
> processes?
>
> Regards,
> Chris

Well, this method is not perfect, but it can help: declare a record
type (either one and make all such signals inout, or three for in, out,
and inout) in a package and use that record type on the ports. Then you
can add or delete elements of the record by editing the package. This
typically works best for a bus or group of related signals that travel
together as a group. The entities may have multiple record ports if
they attach to multiple buses or signal groups. A sketch of the idea is
shown below.

It would be REALLY NICE if VHDL had user-defined modes for record
types, such that different elements of a record-typed port could have
different modes. Those user-defined modes could then be used in a port
specification with a port of that type, and you could define multiple
modes for the same type (e.g. master and slave for a bus).

Besides that, I strongly recommend not using components/configurations
if you do not need the configurability. Directly instantiate the
entity/architectures, and then you don't have any components or
configurations to keep up to date.

Andy
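A minimal sketch of the record-port approach Andy describes; the package name, record fields and the peripheral entity are hypothetical:

library ieee;
use ieee.std_logic_1164.all;

package bus_pkg is
  -- Adding a field here updates every port that uses the record,
  -- instead of editing each entity and instantiation by hand.
  type ctrl_bus_t is record
    addr  : std_logic_vector(15 downto 0);
    wdata : std_logic_vector(31 downto 0);
    we    : std_logic;
  end record;
end package;

library ieee;
use ieee.std_logic_1164.all;
use work.bus_pkg.all;

entity peripheral is
  port (
    clk     : in  std_logic;
    ctrl_in : in  ctrl_bus_t;                      -- whole group as one port
    rdata   : out std_logic_vector(31 downto 0)
  );
end entity;

architecture rtl of peripheral is
begin
  process (clk) is
  begin
    if rising_edge(clk) then
      if ctrl_in.we = '1' then
        rdata <= ctrl_in.wdata;                    -- placeholder behaviour
      end if;
    end if;
  end process;
end architecture;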
Article: 133154

On Wed, 18 Jun 2008 10:10:24 -0700 (PDT), rickman <gnuarm@gmail.com>
wrote:

>On Jun 18, 10:04 am, Brian Drummond <brian_drumm...@btconnect.com>
>wrote:
>> On Wed, 18 Jun 2008 04:57:04 -0700 (PDT), Kolja Sulimma
>> <ksuli...@googlemail.com> wrote:
>> >In many ASIC libraries and some older FPGAs the initial values of
>> >registers are random. 'X' is the perfect representation for that.
>> I completely agree with this: you want to see that any initial "X"es
>> are cleared to valid values in a few cycles after reset, BY THE LOGIC
>> ITSELF and not a spurious "clean" function.
>
>There are applications that will work correctly in hardware without an
>initial condition, but never reach a valid state in simulation without
>it. In those cases I find it easier to add an initialization to the
>hardware to make the simulation match the hardware.
>An example is a free-running divide-by-2 clock generator. It will
>start up in a 1 or a 0 state. It does not matter.

Good point. I would suggest these are the exception rather than the
rule, and need some special treatment by the designer. Initialization
is an unnecessary constraint on the hardware but IMO probably the best
compromise.

Another way is to use a "to_01" function on that specific signal to
arbitrarily resolve the unknown level if you don't want to initialize
(e.g. if you need a portable design, and not all tools/technologies
support the initialization).

>> ... but wouldn't it be nice to tell the simulator to suppress these
>> errors for the first XXX ns, then report them.
>>
>> I expect a simple Tcl script would do it...
>>
>> suppress Numeric_Std invalid value warnings;
>> run 250 ns;
>> enable Numeric_Std invalid value warnings;
>> run;
>>
>> but haven't taken the time to learn enough Tcl to try
>> (my testbenches are entirely VHDL and that keeps me busy enough!).
>
>Is it important to not use an initial condition controlled by a global
>reset? In FPGAs this is typically free, or almost free, to use
>properly. In ASICs I guess it is a different matter.

Not sure I understand the question. If you mean use the global reset
signal to suppress warnings: it won't solve the whole problem.

It'll suppress warnings while Reset is active, but I need to suppress
warnings for a couple of dozen cycles after Reset. Resetting pipeline
registers is unnecessary and inefficient (it prevents packing into
SRL16s, for example), so "XXXX" is to be expected for a small but
deterministic time after reset while the pipelines flush.

- Brian
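A minimal sketch of the divide-by-2 case rickman mentions, with an initial value on the register purely so that simulation starts from a known level; the entity and signal names are hypothetical, and the hardware would work from either starting state:

library ieee;
use ieee.std_logic_1164.all;

entity div2 is
  port (
    clk_in  : in  std_logic;
    clk_div : out std_logic
  );
end entity;

architecture rtl of div2 is
  -- The := '0' only pins down simulation (and the post-configuration
  -- state on FPGAs that honour initial values); the divider itself
  -- does not care whether it starts from '0' or '1'.
  signal toggle : std_logic := '0';
begin
  process (clk_in) is
  begin
    if rising_edge(clk_in) then
      toggle <= not toggle;
    end if;
  end process;
  clk_div <= toggle;
end architecture;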
Article: 133155

On Wed, 18 Jun 2008 07:45:10 -0700 (PDT), faza <fazulu.vlsi@gmail.com>
wrote:

>If I use fixed-to-integer conversion using a left shift operation...
>I am not sure; I may be thrown a compilation error, as the maximum
>long int I can have is 2^32 only..
>
>so if I have such a long 0.99999999999999999998888888888888777777777
>
>it is impossible to have such a big int converted value..

Code that as 1 - small number instead.

- Brian
Article: 133156

On Jun 19, 9:01 am, Brian Drummond <brian_drumm...@btconnect.com>
wrote:
> On Wed, 18 Jun 2008 10:10:24 -0700 (PDT), rickman <gnu...@gmail.com>
> wrote:
>
> >Is it important to not use an initial condition controlled by a
> >global reset? In FPGAs this is typically free, or almost free, to
> >use properly. In ASICs I guess it is a different matter.
>
> Not sure I understand the question. If you mean use the global reset
> signal to suppress warnings: it won't solve the whole problem.
>
> It'll suppress warnings while Reset is active, but I need to suppress
> warnings for a couple of dozen cycles after Reset. Resetting pipeline
> registers is unnecessary and inefficient (it prevents packing into
> SRL16s, for example), so "XXXX" is to be expected for a small but
> deterministic time after reset while the pipelines flush.

I am asking if there is a reason to *not* use the GSR to control the
initial state of the FFs. I am not sure I understand about the SRLs.
They are implemented using the configuration memory in the LUT. That
is definitely loaded on power up. I guess it is not set by the GSR,
but it is well defined even if not controllable from your code. Do you
need a way to control the values of SRL elements other than on
configuration?

If the SRL is in a determined state on configuration, but this state
cannot be restored by a user reset of any sort, how does Xilinx
recommend that you model this? I assume there is some logic in front
of the SRL, but that is not certain, I expect. Either way, each SRL
only has a single input, and you can use another LUT to provide a
"clamp" on the input or output to prevent the spread of the 'X' state.
I know you don't want to use extra resources, but personally, I find
that preferable to fighting the tools (like I am doing now).

Rick
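For context, a minimal sketch of the kind of reset-free pipeline register chain being discussed; because the registers have no reset term, synthesis tools can typically pack the chain into SRL16/SRLC16 shift-register LUTs. The entity name, width and depth are hypothetical:

library ieee;
use ieee.std_logic_1164.all;

entity delay_line is
  generic (DEPTH : positive := 8);
  port (
    clk : in  std_logic;
    d   : in  std_logic_vector(7 downto 0);
    q   : out std_logic_vector(7 downto 0)
  );
end entity;

architecture rtl of delay_line is
  type pipe_t is array (0 to DEPTH - 1) of std_logic_vector(7 downto 0);
  signal pipe : pipe_t;
begin
  process (clk) is
  begin
    if rising_edge(clk) then
      -- No reset on these registers: they are free to map into SRL16s.
      -- Adding a reset here would force them into ordinary flip-flops.
      pipe(0) <= d;
      for i in 1 to DEPTH - 1 loop
        pipe(i) <= pipe(i - 1);
      end loop;
    end if;
  end process;
  q <= pipe(DEPTH - 1);
end architecture;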
Article: 133157

On Jun 19, 1:20 am, hal-use...@ip-64-139-1-69.sjc.megapath.net (Hal
Murray) wrote:
> In article
> <a068485f-0e9d-4a93-beb6-cd825ada1...@j22g2000hsf.googlegroups.com>,
> rickman <gnu...@gmail.com> writes:
> >Anyone know where this configuration is stored? How do I turn it off
> >permanently?
>
> Permanently? :)
>
> You are a hardware geek. Right?

Well, I am a *geek*; I don't know that I would constrain that to
hardware...

> The classical solution is wire cutters at the speaker terminals.

Yes, that is indeed permanent... but not very selective. There are a
few sounds that come from the PC that I don't object to, and even a
few that I like. I would hate to lose those.

I was thinking more along the lines of a way to make the option change
in Synplify more permanent.

The ironic part is I think that I turned it on in the first place. I
thought it might be a good idea. I just didn't know how annoying this
particular beep could be!

Rick
Article: 133158

On Thu, 19 Jun 2008 12:04:16 +0100, Jonathan Bromley
<jonathan.bromley@MYCOMPANY.com> wrote:

>On Thu, 19 Jun 2008 10:37:14 +0100, "Symon" wrote:
>
>>Where have I seen that before?
>>Ah yes, http://en.wikipedia.org/wiki/Transputer
>
>And now, for bonus points, put an [X] against any of the following
>reasons if you think it helps to explain why the Transputer was a
>market flop...
>
>[ ] it was at least two decades ahead of its time
>[ ] it was about one decade ahead of the technology
>    needed to make it powerful enough to be competitive
>[X] it was based on robust, sound academic theory
>[X] it had a clean, effective, readable programming
>    language that was not C
>[X] it was a British design

Worse than that: it was a Government-funded design under a government
that didn't fund technology...

>[ ] the software community was even more clueless about
>    how to make use of multiple scalable compute resources
>    in the late 1970s than it is today

Definitely not this one. As well as O***m, they came up with Ada,
where you can: create a task type; create an array of tasks of that
type; loop over the array starting the tasks; loop over the array
collecting the results; in pretty much no more code than the above.

Nowadays, C++ has given the SW community ONE positive thing: the old
argument that Ada is too large and complex to be usable no longer
holds water... But count the hand-wringing woe-is-me stories about the
difficulty of parallelism and the massive new collaborations to solve
the problem on one hand, and the mentions of Ada (or O***m) on the
other...

>[X] when presented with a model that permits parallel
>    processing to be spectacularly efficient, the
>    software community retreats into its standard cosy
>    set of assumptions that serial execution is sure
>    to be faster and more efficient, and therefore
>    Concurrent Is Bad, mainly because it isn't C
>
>For maximum credit, write a 10,000 word dissertation explaining why
>the social dynamics of the software community will ensure the early
>death of anything that looks like a general-purpose, tightly-coupled
>network of compute nodes.
>
>Not that I'm bitter, or anything like that :-)

The funny thing about the transputer (from the VERY little I did on it
at the time; paper only, since I couldn't afford one as a hobbyist) is
that there was a time window when it appeared to be the fastest single
processor available (ignoring 3-chip solutions), even without its
hooks for parallelism... but that got overshadowed by the parallelism.

If it had had an MMU, it could have made a nice Unix workstation (or
single-chip Lilith, or...)

- Brian
Article: 133159

On Wed, 18 Jun 2008 23:38:42 -0700, Muzaffer Kal <kal@dspia.com> wrote:

>On Wed, 18 Jun 2008 18:01:59 -0700, "SynopsysFPGAexpress"
><fpgas@sss.com> wrote:
>
>>As commodity PC hardware and productivity applications decline in
>>price, EDA tools are as (relatively) expensive as ever, necessitating
>>yet another discussion of "Which simulator is right for me?"
>>
>>The contenders are ...
>
>Don't forget http://fintronic.com/home.html and
>http://simucad.com/products/verilogSimulation/silos-x.html
>
>I personally like Finsim (from Fintronic) a lot. It's a compiled
>simulator and it's quite fast.

Both appear to fail the OP's requirements (and mine); they look to be
Verilog-only.

- Brian

Article: 133160

"rickman" <gnuarm@gmail.com> wrote in message
news:ea4b9ffd-5b67-45d5-afc3-a47bfcb07fbe@x35g2000hsb.googlegroups.com...
> On Jun 19, 1:20 am, hal-use...@ip-64-139-1-69.sjc.megapath.net (Hal
> Murray) wrote:
>> In article
>> <a068485f-0e9d-4a93-beb6-cd825ada1...@j22g2000hsf.googlegroups.com>,
>> rickman <gnu...@gmail.com> writes:
>> >Anyone know where this configuration is stored? How do I turn it
>> >off permanently?
>>
>> Permanently? :)
>>
>> You are a hardware geek. Right?
>
> Well, I am a *geek*; I don't know that I would constrain that to
> hardware...
>
>> The classical solution is wire cutters at the speaker terminals.
>
> Yes, that is indeed permanent... but not very selective. There are a
> few sounds that come from the PC that I don't object to, and even a
> few that I like. I would hate to lose those.
>
> I was thinking more along the lines of a way to make the option
> change in Synplify more permanent.
>
> The ironic part is I think that I turned it on in the first place. I
> thought it might be a good idea. I just didn't know how annoying this
> particular beep could be!
>
> Rick

Hello Rick,

Does the fact that you are now down to worrying about the beeps mean
that you are a bit happier with the Lattice software now?

I ask because I've been using it a while (2 years) and find it pretty
good (compared with X, which I used before). I never had your issues
with Aldec vs. ModelSim because I have had a fully paid-up, separate
Aldec HDL licence for ages.

Michael Kellett
Article: 133161

faza wrote:
<snip>
>
> My input bits width is 16
> coefficient bit width is 16
> output width is 40
>
> regards,
> faza
>
> On Jun 19, 9:20 am, Jeff Cunningham <j...@sover.net> wrote:
>> faza wrote:
>>> If I use fixed-to-integer conversion using a left shift
>>> operation... I am not sure; I may be thrown a compilation error, as
>>> the maximum long int I can have is 2^32 only..
>>> so if I have such a long
>>> 0.99999999999999999998888888888888777777777
>>> it is impossible to have such a big int converted value..
<snip>

You continue to demonstrate an extremely weak grasp of simple concepts.
While I still encourage you to talk to your professors and *choose
another pursuit* for your final year project, I'll point out that
although you declare these tiny 32-bit integers too small to
accommodate numbers like
0.99999999999999999998888888888888777777777, you also state you have
16-bit input widths and 16-bit coefficients. That number carries so
much more precision than 16 bits that the comparison is silly.

FIRs need limited size for their operands. You now seem to specify
reasonable constraints on your coefficients and data, but do you know
why those values were chosen, or did your professor or Matlab (or
paperboy) tell you you needed these realistic values? Because
precision isn't free, all digital filtering has "quantization effects"
which can affect the filter characteristics and/or the noise generated
by your filter, even stability in IIR filters.

You need to know many characteristics of the filter you're trying to
implement to properly size an FIR in taps, input width, and coefficient
width. It's incomplete to declare a "cutoff frequency" and think you
are ready to design. Do you know your required passband ripple?
Stopband attenuation? Steepness of your cutoff?

You need to 1) research more what digital filter requirements are
needed for your FIR implementation, 2) understand various ways digital
filters are implemented (or at least research the FIR), and 3) learn
how to do hardware design with FPGAs. This last item will set you back
months, by my guess.

PLEASE choose another project with which you can succeed.
Article: 133162

Hi,

I have generated a user IP with DMA/SG support. Now I want to disable
the DMA burst transfer. The comment in the VHDL source says:

-- specify the size (must be power of 2) of burst that dma uses to
-- tranfer data on the bus, a value of one causes dma to use single
-- transactions (burst disabled).

so I changed:

constant DMA_BURST_SIZE : integer := 16;

to

constant DMA_BURST_SIZE : integer := 1;

but when I now synthesize my design, I get the following error:

ERROR:Xst:774 - "/home/fatmike/EDK/hw/XilinxProcessorIPLib/pcores/
ipif_common_v1_00_d/hdl/vhdl/dma_sg_sim.vhd" line 524: Constant must
have a value : 's'.

What is wrong here, and how can I disable burst mode? I am using
Xilinx EDK 9.1i.

Thank you in advance,
Sebastian
Article: 133163

On Jun 18, 11:38 pm, Muzaffer Kal <k...@dspia.com> wrote:
> On Wed, 18 Jun 2008 18:01:59 -0700, "SynopsysFPGAexpress"
> <fp...@sss.com> wrote:
>
> >As commodity PC hardware and productivity applications decline in
> >price, EDA tools are as (relatively) expensive as ever, necessitating
> >yet another discussion of "Which simulator is right for me?"
>
> >The contenders are ...
>
> Don't forget http://fintronic.com/home.html and
> http://simucad.com/products/verilogSimulation/silos-x.html
>
> I personally like Finsim (from Fintronic) a lot. It's a compiled
> simulator and it's quite fast.

I've been using Veritak, a low-cost Verilog simulator. It has had some
bugs, but the author is VERY quick to fix problems. A surprise - he is
also VERY responsive to requests for new features or enhancements.
I'm very pleased with his product.

John Providenza
Article: 133164

On Jun 19, 9:43 am, "MK" <nos...@please.com> wrote:
> "rickman" <gnu...@gmail.com> wrote in message
> news:ea4b9ffd-5b67-45d5-afc3-a47bfcb07fbe@x35g2000hsb.googlegroups.com...
>
> > On Jun 19, 1:20 am, hal-use...@ip-64-139-1-69.sjc.megapath.net (Hal
> > Murray) wrote:
> >> In article
> >> <a068485f-0e9d-4a93-beb6-cd825ada1...@j22g2000hsf.googlegroups.com>,
> >> rickman <gnu...@gmail.com> writes:
> >> >Anyone know where this configuration is stored? How do I turn it
> >> >off permanently?
>
> >> Permanently? :)
>
> >> You are a hardware geek. Right?
>
> > Well, I am a *geek*; I don't know that I would constrain that to
> > hardware...
>
> >> The classical solution is wire cutters at the speaker terminals.
>
> > Yes, that is indeed permanent... but not very selective. There are
> > a few sounds that come from the PC that I don't object to and even
> > a few that I like. I would hate to lose those.
>
> > I was thinking more along the lines of a way to make the option
> > change in Synplify more permanent.
>
> > The ironic part is I think that I turned it on in the first place.
> > I thought it might be a good idea. I just didn't know how annoying
> > this particular beep could be!
>
> > Rick
>
> Hello Rick,
>
> Does the fact that you are now down to worrying about the beeps mean
> that you are a bit happier with the Lattice software now?
> I ask because I've been using it a while (2 years) and find it pretty
> good (compared with X which I used before).
> I never had your issues with Aldec v ModelSim because I have had a
> fully paid-up separate Aldec HDL for ages.

No, in fact I am having an issue with using the GSR right now. The
tools have a Design Planner which allows you to edit pin assignments
and port settings. But for some reason the tool has stopped working
correctly. I can open the spreadsheet view, but I don't have a Port
Assignments tab. I have a Pin Assignments tab, but this won't let me
edit anything.

I am actually reaching that point where every little thing is driving
me nuts! So the beeping is becoming more and more annoying... it's
almost like it is *mocking* me... beep, that won't work... beep, that
won't work either... beep, beep, beep!!! ;^)

The local FAE is pretty good and has been trying to support me. But
there is only so much that can be done by email or over the phone. I
have a workaround for the GSR issue, but I still want to get the Design
Planner working correctly, since otherwise all changes to the port I/Os
have to be done by hand editing the LPF file, and I can't find docs on
the command formats and values.

It's times like these that I wish I had kids to yell at!

Rick
Article: 133165

On Jun 19, 9:43 am, "MK" <nos...@please.com> wrote:
>
> Hello Rick,
>
> Does the fact that you are now down to worrying about the beeps mean
> that you are a bit happier with the Lattice software now?
> I ask because I've been using it a while (2 years) and find it pretty
> good (compared with X which I used before).
> I never had your issues with Aldec v ModelSim because I have had a
> fully paid-up separate Aldec HDL for ages.

I did want to say one other thing. My original approach to this design
was to use the free tools for synthesis and the free X tools for
simulation. But X has gone with an in-house simulator and it is pretty
bad! It seems to take a long time to compile the design, longer than
most of the simulations I run... at least when I am debugging the
details. I don't recall the specifics, but the UI has some significant
limitations, and I want to say it has a few bugs, but I'm not sure.

The point is that it was bad enough that I decided to shell out nearly
$1k for the Lattice tools that included a simulator. The Aldec tool is
fine; in fact, it may be better than ModelSim. It is fast to compile,
and even the speed-limited version seems to simulate reasonably fast.

But the rest of the Lattice tools have given me a lot of trouble, and
mostly the licensing has been very painful. I am still running two
versions of ispLever: 7.0 for synthesis and 7.1 for simulation. They
won't give me a license for the simulator I purchased with 7.0, and
they can't get the synthesis to work under 7.1 on my machine.

Does any of this sound like I am "happier"?

Rick
Article: 133166

Mike,

Thank you. I have forwarded this off to the folks responsible. It is
important that we get the documentation as accurate as possible.

I think the original post concerned itself with "we don't document
what we don't have", which is a more generic problem.

I have two blogs on the subject in the mill for the Xilinx blog spot:

http://forums.xilinx.com/xlnx/blog?blog.id=pld

So, stay tuned. Ken's latest blog is pretty interesting, too.

Austin
Article: 133167

rickman wrote:
> I am running Synplify from the Lattice ispLever tools and every time
> I compile the design Synplify beeps when it is started and beeps when
> it finishes. It is a lot louder than the other sounds the computer
> makes and is a very irritating noise. They have a control to turn it
> off, but I can't figure out how to make the control stay off from one
> run to the next. It seems to default to beeping.
>
> Anyone know where this configuration is stored? How do I turn it off
> permanently?
>
> Rick

Rick,

My tool is licensed, so I'm not sure if it's the same as what you're
working with. My ispLever Project Manager has the ability to launch
"Synplify Pro Synthesis" from the Tools menu. Once in Synplify Pro,
the Options menu "Project View Options..." has a "Beep when a job
completes" checkbox on the second line.

This is Synplify Pro for Lattice 9.4L, launched from the 7.1 Lattice
tool suite.

- John_H
Article: 133168

Hi,

I am trying to access a QuickLogic FPGA via JTAG. I am using an FTDI
DLP2232M-G (and the associated FTCJTAG API) to connect to the device.
So far I have been able to read and write data to the device
successfully, or at least I think so! My problem is that, presented
with read/write API functions, I have no idea what I *should* be
writing to or reading from the FPGA, and to which registers
(instruction/test).

My goal (at least for now) is to sample the state of every pin on the
device. My question is this: does anyone know the commands needed to
sample the state of every pin on the FPGA, and if so, how do I
interpret what I read back?

Please bear in mind that I am a software engineer and know virtually
nothing about FPGAs and very little about JTAG (despite reading the
IEEE standard and understanding very little).

Thanks in advance for any help.

_TK_
Article: 133169

rickman wrote:
> I am asking if there is a reason to *not* use the GSR to control the
> initial state of the FFs. I am not sure I understand about the SRLs.
> They are implemented using the configuration memory in the LUT. That
> is definitely loaded on power up. I guess it is not set by the GSR,
> but it is well defined even if not controllable from your code. Do
> you need a way to control the values of SRL elements other than on
> configuration?

I don't use the GSR or SRL because they are non-portable and fussy.
I save time by using an explicit reset of every register.

> I know you don't want to use extra resources, but personally, I find
> that preferable to fighting the tools (like I am doing now).

I agree with you there. I prefer saving time to saving some extra
unused flops.

-- Mike Treseler
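A minimal sketch of the explicit-reset style Mike describes, with a reset term on every register; the entity and signal names are hypothetical, and the choice of synchronous vs. asynchronous reset is a separate portability decision:

library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity counter is
  port (
    clk   : in  std_logic;
    rst   : in  std_logic;                 -- explicit design reset, not GSR
    count : out std_logic_vector(7 downto 0)
  );
end entity;

architecture rtl of counter is
  signal cnt : unsigned(7 downto 0);
begin
  process (clk) is
  begin
    if rising_edge(clk) then
      if rst = '1' then
        cnt <= (others => '0');            -- every register gets a defined state
      else
        cnt <= cnt + 1;
      end if;
    end if;
  end process;
  count <= std_logic_vector(cnt);
end architecture;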
Article: 133170

SynopsysFPGAexpress wrote:
> As commodity PC hardware and productivity applications decline in
> price, EDA tools are as (relatively) expensive as ever, necessitating
> yet another discussion of "Which simulator is right for me?"
>
> The contenders are ...
...

If you have Xilinx ISE 10.1, check out ISim, which comes "free" with
it. It's much improved and may meet your needs, and future releases
should have a better user interface. I don't think it currently
supports SystemVerilog (and you are correct in propagating your
religion), but it might soon.

ModelSim is still the best, but you pay for a lot of things you don't
really need, and the waveform viewer could be improved. That's where
you spend 90% of your time during debugging, so it should be a little
easier to use.

-Kevin
Article: 133171

On Jun 18, 5:39 pm, Jim Granville <no.s...@designtools.maps.co.nz>
wrote:
> Next, will we see boards with GPU and FPGA ?
>
> -jg

What is missing is an embedded solution. To my knowledge, there is no
compact embedded system using the latest Nvidia GPUs. We can't fit a
PC (even a smallish one) inside our instrument. Even if we could, heat
dissipation would be a problem. So I guess that FPGAs will still reign
for a while in high-performance embedded applications.

Patrick
Article: 133172

In article <g3e8sk$ssv1@cnn.xsj.xilinx.com>,
Kevin Neilson <kevin_neilson@removethiscomcast.net> wrote:
>... Modelsim is still the best, but you pay for a lot of things you
>don't really need, and the waveform viewer could be improved. That's
>where you spend 90% of your time during debugging so it should be a
>little easier to use.

I've used vcs and currently have access to ncsim, so I've not bothered
with ModelSim. However, recently I tried the free one which comes with
the Altera Web Edition.

In either of those, you typically simulate with all signals dumped to
a huge output file (using $dumpvars(0); $dumpon; for vcs, or
$shm_open(...); $shm_probe(..); for ncsim). Then you can explore the
design hierarchy and choose which signals to view in vcs -RPP or
SimVision. The same is true even for Icarus Verilog with GTKWave.

However, with ModelSim it looks like there is no way to do this.
Instead, when you add a signal to the viewer in the GUI, it re-runs
the entire simulation to get the new signal. Am I missing something,
or is this really how it works? I can't believe that it would really
work this way. (Also, the crippled free ModelSim is slower than
Icarus.)

Has anyone tried to simulate Altera's IP with Icarus? I'm thinking
about the .vo simulation model of the altmemphy DDR controller.

--
/* jhallen@world.std.com AB1GO */                /* Joseph H. Allen */
int a[1817];main(z,p,q,r){for(p=80;q+p-80;p-=2*a[p])for(z=9;z--;)q=3&(r=time(0)
+r*57)/7,q=q?q-1?q-2?1-p%79?-1:0:p%79-77?1:0:p<1659?79:0:p>158?-79:0,q?!a[p+q*2
]?a[p+=a[p+=q]=q]=q:0:0;for(;q++-1817;)printf(q%79?"%c":"%c\n"," #"[!a[q-1]]);}
Article: 133173

On Jun 18, 3:23 pm, John_H <newsgr...@johnhandwork.com> wrote:
> Beantown wrote:
>
> > Hi John,
>
> > The tool reports that there could be excessive skew on the clk
> > signal. I also have this signal mapped to an IO pin. Maybe that is
> > what the tool is complaining about?
>
> > Thanks for the suggestion of using the FPGA Editor. I will take a
> > look to see what is actually going on.
>
> You probably have it figured out. What I like to do to get clocks to
> the outputs is use the IOB DDR flops with opposite DC values on the
> D inputs for the two phases. I get a clock that stays 100% global
> and is better correlated to output data generated by that clock.
>
> - John

Hi John,

Thanks for the suggestion for outputting the clock using the DDR
flops. I'll give it a try.

BT
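A minimal sketch of the clock-forwarding trick John describes, using a Xilinx ODDR2 output DDR register with constant opposite values on its two data inputs. The primitive and its generics vary by family (ODDR on Virtex-4/5, FDDRRSE on older parts), and the entity/port names here are hypothetical, so treat this as an assumption to check against the target library:

library ieee;
use ieee.std_logic_1164.all;
library unisim;
use unisim.vcomponents.all;

entity clk_forward is
  port (
    clk     : in  std_logic;   -- internal global clock
    clk_n   : in  std_logic;   -- inverted copy of clk (e.g. from a DCM)
    clk_out : out std_logic    -- forwarded clock pin
  );
end entity;

architecture rtl of clk_forward is
begin
  -- The IOB flop toggles every half period because D0/D1 are tied to
  -- opposite constants, so the pin carries a copy of clk that never
  -- leaves the global routing until the IOB itself.
  u_oddr : ODDR2
    generic map (
      DDR_ALIGNMENT => "NONE",
      INIT          => '0',
      SRTYPE        => "SYNC")
    port map (
      Q  => clk_out,
      C0 => clk,
      C1 => clk_n,
      CE => '1',
      D0 => '1',
      D1 => '0',
      R  => '0',
      S  => '0');
end architecture;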
Article: 133174

On Thu, 19 Jun 2008 19:27:10 +0000 (UTC)
jhallen@TheWorld.com (Joseph H Allen) wrote:

> However with modelsim it looks like there is no way to do this.
> Instead, when you add a signal to the viewer in the GUI, it re-runs
> the entire simulation to get the new signal. Am I missing something
> or is this really how it works? I can't believe that it would really
> work this way.

Invoke vsim with -do "log -r *; run -all; quit -f" and -wlf
"mydump.wlf", and you'll get similar results (just in a different
format).

In my experience ncsim is faster than ModelSim, and of course it
carries a higher price tag. Older versions of ModelSim also seemed to
stall after long simulations, whereas ncsim never gave me any problems.

--
Faster, faster, you fool, you fool!
        -- Bill Cosby