Site Home Archive Home FAQ Home How to search the Archive How to Navigate the Archive
Compare FPGA features and resources
Threads starting:
Authors:A B C D E F G H I J K L M N O P Q R S T U V W X Y Z
For a handheld device, powered by a Li-ion cell, I need a charge power input. Most standard chips for charging and powering the device are meant for USB power or similar. Since the current standard charging connector for mobile phones is a micro USB connector, it seems best to use that. It means you can use a lot of standard chargers. Agreed?

For the device, I also need a serial debug connection. It is OK to have the 3V3 UART connection on an internal header. This means the device must be opened for debugging, and maybe a slot must be made in the enclosure for longer test sessions with the enclosure on. But is it possible and/or advisable to use the spare pins on the micro USB connector to bring out the 3V3 UART signals? It would save a header and make debugging in the enclosure a lot simpler. But it must not lead to damage to the device and/or (Windows/Linux/Mac) PC when the device is plugged into a PC. There must not be any strange behaviour on the PC, and preferably there must not be any detection by the OS of USB activity.

You could of course use a real USB/serial connection, but that would mean adding an FTDI chip or a big software effort on the DSP (which has USB hardware). It would also mean every PC user has access to the debug channel, and I'm not sure we want that.

So putting the 3V3 UART signals (TX/RX only) on the micro USB and using a special breakout box (possibly with an FTDI chip) for debugging seems the most practical solution for now. Any arguments against it? Any experiences with such a setup? And which signal on which pin for the least chance of damage and strange behaviour? I know some USB chargers have shorts or resistors on the data lines, so at least a few protection resistors on the device side are required.

--
Stef (remove caps, dashes and .invalid from e-mail address to reply by mail)

To the systems programmer, users and applications serve only to provide a test load.

Article: 154701
Stef <stef33d@yahooi-n-v-a-l-i-d.com.invalid> wrote: > For a handheld device, powered by a Li-ion cell, I need a charge power > input. Most standard chips for charging and powering the device are > meant for USB power or simular. Since the current standard charging > connector for mobile phones is a micro USB connector, it seems best to > use that. It means you can use a lot of standard chargers. Agreed? > For the device, I also need a serial debug connection. It is OK to have > the 3V3 UART connection on an internal header. This means the device > must be opened for debugging and maybe a slot must be made in the > enclosure for longer test sessions with enclosure. You mean USB and serial at the same time? It seems usual for USB mouse to also work as a PS/2 mouse with the same lines and a connector/adaptor. But it is one or the other. > But is it possible and/or advisable to use the spare pins on the micro > USB connector to bring out the 3V3 UART signals? It would save a header > and make debugging in the enclosure a lot simpler. But it must not > lead to damage to the device and or (windows/linux/mac) PC when the > device is plugged in to a PC. There must not be any strange behaviour on > the PC and preferably there must not be any detection by the OS of USB > activity. I think you should be at least 5V safe. The way to do that with 3V3 lines is well known, especially for slow signals. (I think you only need resistors.) > You could ofcourse use a real USB/serial connection, but that would > mean adding an FTDI chip or a big software effort on the DSP (which has > USB hardware). It would also mean every PC user has acces to the debug > channel, I'm not sure we want that. > So putting the 3V3 UART signals (TX/RX only) on the micro USB and use > a special breakout box (possibly with FTDI chip) for debugging seems the > most practical solution for now. Any arguments against it? > Any experiences with such a setup? 
And which signal on which pin > for least chance on damage and strange behaviour? I believe it is current limiting resistors, as long as the signals aren't too fast. > I know some USB chargers have shorts or resistors on the > datalines, so at least a few protection > resistors on the device side are required. As I understand it, USB devices requiring more than the default Icc are supposed to negotiate with the host. That is hard to do when the host is just a power adaptor, so there seem to be conventions for the voltages on the data lines that allow devices to pull more current. That is, for example, how iPod chargers work. You might also want to do that. -- glenArticle: 154702
On Thursday, December 20, 2012 9:58:23 AM UTC, o pere o wrote: > On 12/19/2012 11:55 PM, wrote: > > > On Dec 19, 9:32 am, o pere o <m...@somewhere.net> wrote: > > >> My current goal is to implement some digital signal processing (filters) > > >> on a FPGA. I am currently using Terasics DE0 nano board. This board has > > >> an ADC128S022 ADC. I have started as follows: > > >> > > >> From the 50 MHz board reference I derive a 25.6 MHz signal with a PLL. > > >> From this clock I generate the signals required to drive the ADC, > > >> essentially a clock at 3.2 MHz. Every 16 clock cycles, the ADC gives a > > >> 12 bit sample. This translates into 200 ksps. I generate a signal > > >> "smpl_rdy" at the appropriate position which allows me to latch the 8 > > >> most significant bits. > > >> > > >> The main question is how should I do the signal processing: > > >> > > >> a) Using the 25.6 MHz clock and using smpl_rdy as a clock enable > > >> b) Deriving a new 200 kHz clock from the PLL > > >> > > >> I have done some projects on FPGAs but they were quite simple, so I > > >> consider myself only a little more than a beginner. I can think of some > > >> problems with both approaches, but I may have overlooked may others: > > >> > > >> For instance, if I followed a), I guess that Quartus II would think that > > >> the processing happens at 25.6 MHz: if there is a long combinational > > >> path between registers, the timing analyzer will not be able to figure > > >> out that the data and the enable signal are stable during 16 clock > > >> cycles. Is there a way to provide this info to Quartus II? OTOH, using > > >> the same signal as an enable for everything further down does not seem > > >> sound enough, thinking of fanouts. So what? > > >> > > >> If I tried to follow b), how would I ensure that there is the proper > > >> phase relationship between both clocks? Is there a way to achieve this? > > >> > > >> Thanks for any advice. 
> > >> > > >> Pere > > > > > > Do you need the 25.6MHz or could you do everything synchronous on a > > > 3.2MHz ? > > > Then, even without telling it that you have multi cycles paths, timing > > > should > > > be easy > > > > In this case, I could do everything at a much lower frequency. As I have > > to generate the signals to control the ADC, my approach has been to > > start with a system frequency at least 2x the ADC clock frequency that I > > have to generate. So, I could work with 6.4MHz and timing would be much > > easier. However, the main point of my question was to learn the proper > > way to do this. > > > > > But also consider that for a lower clock rate you might need more > > > resources > > > > > > i.e. filter running at 25.6MHz might only need one mul-acc, where a > > > filter > > > running at 3.2MHz needs 8 > > > > That's certainly true! > > > > BTW, any inputs on whether using my smpl_rdy as an enable for each > > register is a good/bad idea? That is how I would do it for sure in this case. > > > > > -Lasse > > > > > Thanks for your inputs! > > > > PereArticle: 154703
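The clock-enable approach (a) discussed above can be illustrated with a toy cycle-based Python model (illustrative only, not HDL; names like smpl_rdy come from the thread): every flip-flop runs from the single 25.6 MHz clock but only captures a new value on the cycle where the enable is high, so combinational paths between registers effectively get 16 clock periods to settle, which is exactly what a multicycle-path constraint would communicate to the timing analyzer.

```python
# Toy model of clock-enable processing: registers tick every system
# clock but only capture new values when the enable (smpl_rdy) is high.
# Illustrative sketch only -- this is a behavioral model, not HDL.

def run(num_clocks, enable_period, samples):
    """Clock a 2-stage register pipeline; capture only when enabled."""
    stage1 = 0
    stage2 = 0
    out = []
    it = iter(samples)
    for clk in range(num_clocks):
        smpl_rdy = (clk % enable_period == enable_period - 1)
        if smpl_rdy:                      # clock enable, not a derived clock
            stage2 = stage1               # both registers update together,
            stage1 = next(it, stage1)     # once per enable_period clocks
        out.append(stage2)
    return out

# With a 16-cycle enable period, each sample reaches stage2 two enables
# after arriving -- but all flops share the one fast system clock.
result = run(num_clocks=48, enable_period=16, samples=[10, 20, 30])
```

The point of the model is the single clock domain: the enable only gates *when* the registers load, so there is no second clock whose phase would have to be controlled, as in approach (b).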
In comp.arch.fpga, Stef <stef33d@yahooI-N-V-A-L-I-D.com.invalid> wrote:
>

Sorry, wrong group! I'll re-post in comp.arch.embedded

--
Stef (remove caps, dashes and .invalid from e-mail address to reply by mail)

For most men life is a search for the proper manila envelope in which to get themselves filed. -- Clifton Fadiman

Article: 154704
In comp.arch.fpga, glen herrmannsfeldt <gah@ugcs.caltech.edu> wrote:
> Stef <stef33d@yahooi-n-v-a-l-i-d.com.invalid> wrote:

Sorry, I meant to post this to comp.arch.embedded. Re-posted it there, I'll try to cancel it here. If you can, put any replies over there. But I'll keep reading here for any replies.

>> For a handheld device, powered by a Li-ion cell, I need a charge power
>> input. Most standard chips for charging and powering the device are
>> meant for USB power or similar. Since the current standard charging
>> connector for mobile phones is a micro USB connector, it seems best to
>> use that. It means you can use a lot of standard chargers. Agreed?
>
>> For the device, I also need a serial debug connection. It is OK to have
>> the 3V3 UART connection on an internal header. This means the device
>> must be opened for debugging and maybe a slot must be made in the
>> enclosure for longer test sessions with the enclosure on.
>
> You mean USB and serial at the same time?

No, I just need the USB for power. I want to use the 'spare' lines for serial.

> It seems usual for a USB mouse to also work as a PS/2 mouse with
> the same lines and a connector/adaptor. But it is one or the other.
>
>> But is it possible and/or advisable to use the spare pins on the micro
>> USB connector to bring out the 3V3 UART signals? It would save a header
>> and make debugging in the enclosure a lot simpler. But it must not
>> lead to damage to the device and/or (Windows/Linux/Mac) PC when the
>> device is plugged into a PC. There must not be any strange behaviour on
>> the PC and preferably there must not be any detection by the OS of USB
>> activity.
>
> I think you should be at least 5V safe. The way to do that with 3V3
> lines is well known, especially for slow signals. (I think you only
> need resistors.)

Protecting my device is probably the easiest problem. But how to keep the PC from sensing there is anything on the USB and emitting a 'connect sound', for instance?
>> You could of course use a real USB/serial connection, but that would
>> mean adding an FTDI chip or a big software effort on the DSP (which has
>> USB hardware). It would also mean every PC user has access to the debug
>> channel, I'm not sure we want that.
>
>> So putting the 3V3 UART signals (TX/RX only) on the micro USB and using
>> a special breakout box (possibly with an FTDI chip) for debugging seems
>> the most practical solution for now. Any arguments against it?
>> Any experiences with such a setup? And which signal on which pin
>> for the least chance of damage and strange behaviour?
>
> I believe it is current limiting resistors, as long as the
> signals aren't too fast.

Guess that will work.

>> I know some USB chargers have shorts or resistors on the
>> datalines, so at least a few protection
>> resistors on the device side are required.
>
> As I understand it, USB devices requiring more than the default
> Icc are supposed to negotiate with the host. That is hard to
> do when the host is just a power adaptor, so there seem to be
> conventions for the voltages on the data lines that allow devices
> to pull more current. That is, for example, how iPod chargers work.
>
> You might also want to do that.

There is an 'apple convention' for signalling power need. The only other 'standard' I know of is for USB chargers: they should have D+ and D- shorted to signal the device that it can draw 1 A?

My worry is not about the power. It is OK if my device does not charge (properly) when connected to a PC. The user should use the specified charger. It may be 'nice to have' to allow charging from a PC, so it is worth looking into this a little. But I will not implement the proper USB negotiation, as I do not have USB device software running. But maybe with some sensing it is possible to limit the power draw to only 100 mA when connected to a PC.

--
Stef (remove caps, dashes and .invalid from e-mail address to reply by mail)

This night methinks is but the daylight sick.
-- William Shakespeare, "The Merchant of Venice"

Article: 154705
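The charger-detection logic discussed in this exchange can be sketched as decision code. This is a hedged illustration, not firmware: under the USB Battery Charging spec (BC1.2), a dedicated charging port is identified by D+ and D- being shorted together, while a normal host has separate terminations on the data lines; Apple chargers instead bias D+/D- with resistor dividers. The voltage thresholds below are illustrative assumptions, not values taken from any spec.

```python
# Hedged sketch of charger-type classification as discussed above.
# Inputs would come from comparators/ADC pins watching D+ and D-.
# Thresholds are illustrative assumptions, not spec-exact values.

def classify_port(dp_dm_shorted, dp_volts, dm_volts):
    """Rough classification of what is on the far end of the USB cable."""
    if dp_dm_shorted:
        # BC1.2 dedicated charging port: D+/D- shorted, higher current OK
        return "dedicated charger"
    if dp_volts > 1.5 and dm_volts > 1.5:
        # data lines biased up by resistor dividers: Apple-style adapter
        return "apple-style charger"
    # anything else: assume a real USB host and stay at the 100 mA default
    return "host or unknown"
```

If only the shorted/not-shorted test is implemented, the safe fallback for everything else is the 100 mA default draw Stef mentions.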
It is my first time here, so hello everybody! :)

I have problems with a Xilinx FIFO on a Spartan 3. As far as I understand it, a standard FIFO sends data out in the next clock cycle after I set the 'rd_en' signal to '1'. In my case I get the output not in the next, but in the second clock cycle. Do you know what may cause that problem?

Here is what I do (first I put two byte-wide words into the FIFO):
STATE1 => if the FIFO is not empty, set 'rd_en' to '1', go to STATE2,
STATE2 => I keep 'rd_en' equal to '1' and go to STATE3 (the output here is zero),
STATE3 => I know there were two words, so 'rd_en' to '0'; here I can get the first byte, and I go to STATE4,
STATE4 => 'rd_en' still '0', I can get the second byte, then I go to and stay in the IDLE state.

Every state lasts one clock cycle, except STATE1, where I wait for a non-empty FIFO, and obviously IDLE at the end.

Regards,
Valdez

---------------------------------------
Posted through http://www.FPGARelated.com

Article: 154706
Valdez wrote: > It is my first time here, so hello everybody! :) > > I have problems with Xilinx FIFO on Spartan 3. As far as I understand it, > standard FIFO sends data out in the next clock cycle after I set 'rd_en' > signal to '1'. In my case I get output not in the next, but in the second > clock cycle. Do you know what may cause that problem? I suppose you are using a FIFO generated with Core Generator? There is an option there to "use embedded registers". If you enable that, a register stage will be added at the FIFO output, adding one clock cycle of delay. But I believe that is not your problem, see below. > Here is what I do (first I put in FIFO two byte words): > STATE1 => if FIFO is not empty then 'rd_en' to '1', go to STATE2, > STATE2 => I keep 'rd_en' equal '1', and go to STATE3, (the output here is > zero), > STATE3 => I know there were two byte words so 'rd_en' to '0', here I can > get first byte, and I go to STATE4, > STATE4 => 'rd_en' still '0', I can get second byte, then I go and stay into > IDLE state. So you expect to get the first data word in STATE2? I suppose you have your state machine in a clocked process. If you look at the simulation, you should be able to see that the rd_en-signal you set to '1' in STATE1 doesn't go high until you reach STATE2. That's because rd_en is the output of a flip flop and will not be updated until the next rising edge of the clock (or falling edge, if that's how you wrote it). One clock cycle later (by then you've reached STATE3) the first data word appears at the FIFO output, so everything's "logical" and behaves the way that it should (data is available one clock cycle after rd_en is set to '1'). Hope that helps, SeanArticle: 154707
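The timing Sean describes can be reproduced with a toy cycle-based Python model (a behavioral sketch, not HDL): rd_en assigned inside a clocked FSM only becomes '1' at the next clock edge, and a standard (non-FWFT) FIFO presents dout one further edge after rd_en is sampled high. The model mirrors the STATE1..STATE4 sequence from the question; the empty check in STATE1 is omitted for brevity.

```python
# Toy model of a clocked FSM driving a standard-latency FIFO read port.
# All "registers" (state, rd_en, dout) update together on each edge,
# from the values they held before the edge -- as flip-flops do.

from collections import deque

NEXT_STATE = {"STATE1": "STATE2", "STATE2": "STATE3",
              "STATE3": "STATE4", "STATE4": "IDLE", "IDLE": "IDLE"}

def simulate(fifo_words, cycles):
    fifo = deque(fifo_words)
    state = "STATE1"
    rd_en = 0          # flip-flop output driven by the FSM
    dout = 0           # FIFO output register
    trace = []
    for _ in range(cycles):
        # one rising clock edge: everything updates from the old values
        if rd_en and fifo:
            dout = fifo.popleft()          # valid 1 cycle after rd_en = 1
        next_rd_en = 1 if state in ("STATE1", "STATE2") else 0
        state = NEXT_STATE[state]
        rd_en = next_rd_en
        trace.append((state, rd_en, dout))
    return trace
```

Running it with two words shows the first byte arriving in STATE3 and the second in STATE4, one cycle "later" than a combinational rd_en would suggest, which is exactly the behaviour reported in the question.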
>Valdez wrote:
>> It is my first time here, so hello everybody! :)
>>
>> I have problems with Xilinx FIFO on Spartan 3. As far as I understand it,
>> standard FIFO sends data out in the next clock cycle after I set 'rd_en'
>> signal to '1'. In my case I get output not in the next, but in the second
>> clock cycle. Do you know what may cause that problem?
>
>I suppose you are using a FIFO generated with Core Generator? There is
>an option there to "use embedded registers". If you enable that, a
>register stage will be added at the FIFO output, adding one clock cycle
>of delay. But I believe that is not your problem, see below.
>
>> Here is what I do (first I put in FIFO two byte words):
>> STATE1 => if FIFO is not empty then 'rd_en' to '1', go to STATE2,
>> STATE2 => I keep 'rd_en' equal '1', and go to STATE3, (the output here is
>> zero),
>> STATE3 => I know there were two byte words so 'rd_en' to '0', here I can
>> get first byte, and I go to STATE4,
>> STATE4 => 'rd_en' still '0', I can get second byte, then I go and stay into
>> IDLE state.
>
>So you expect to get the first data word in STATE2? I suppose you have
>your state machine in a clocked process. If you look at the simulation,
>you should be able to see that the rd_en-signal you set to '1' in STATE1
>doesn't go high until you reach STATE2. That's because rd_en is the
>output of a flip flop and will not be updated until the next rising edge
>of the clock (or falling edge, if that's how you wrote it).
>
>One clock cycle later (by then you've reached STATE3) the first data
>word appears at the FIFO output, so everything's "logical" and behaves
>the way that it should (data is available one clock cycle after rd_en is
>set to '1').
>
>Hope that helps,
>Sean

Sure, that helps :) I am new at this and always forget that a signal's value changes only after the clock edge. Thanks for the reminder. I will have to check this in ISim on Windows, because I can't run it on Ubuntu.

But I don't see yet how to get data in a situation where I switch between two states in a clocked process. In the first state I'd like to get a data byte; in the second I don't want the next byte. Then I go back to the first... this will be repeated 512 times. Could you please give me an idea for such an algorithm?

---------------------------------------
Posted through http://www.FPGARelated.com

Article: 154708
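One common answer to this kind of alternating-read question (a sketch of the classic toggle approach, not code from the thread): keep a one-bit toggle in the clocked process, assert rd_en only on alternate cycles, and let the FIFO's one-cycle read latency deliver each byte during the "off" cycle. The Python model below (behavioral only, not HDL) shows the idea for the 512 reads mentioned.

```python
# Sketch of reading one FIFO word every other clock: rd_en is asserted
# on even cycles only, and each word lands one cycle later, while rd_en
# is low -- so the capture always happens in the gap cycle.

from collections import deque

def read_every_other(words, total_reads):
    fifo = deque(words)
    rd_en = 0
    captured = []
    for cycle in range(4 * total_reads):      # bounded; ~2 cycles per read
        # rising edge: data appears one cycle after rd_en was high
        if rd_en and fifo:
            captured.append(fifo.popleft())
        # toggle: request a byte on even cycles only, until done
        rd_en = 1 if cycle % 2 == 0 and len(captured) < total_reads else 0
    return captured
```

In an HDL this would be a two-state machine (request / capture) with a counter that stops the requests after 512 bytes; the model above compresses that into the `cycle % 2` toggle and the `len(captured)` check.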
On 19 Dez., 11:17, o pere o <m...@somewhere.net> wrote:
> On 12/19/2012 10:19 AM, Thomas Stanka wrote:
> > On 19 Dez., 09:32, o pere o <m...@somewhere.net> wrote:
> >> My current goal is to implement some digital signal processing (filters)
> >> on a FPGA. I am currently using Terasic's DE0 nano board. This board has
> >> an ADC128S022 ADC. I have started as follows:
> >>
> >> From the 50 MHz board reference I derive a 25.6 MHz signal with a PLL.
> >> From this clock I generate the signals required to drive the ADC,
> >> essentially a clock at 3.2 MHz. Every 16 clock cycles, the ADC gives a
> >> 12 bit sample. This translates into 200 ksps. I generate a signal
> >> "smpl_rdy" at the appropriate position which allows me to latch the 8
> >> most significant bits.
> >>
> >> The main question is how I should do the signal processing:
> >>
> >> a) Using the 25.6 MHz clock and using smpl_rdy as a clock enable
> >> b) Deriving a new 200 kHz clock from the PLL
> >>
> >> I have done some projects on FPGAs but they were quite simple, so I
> >> consider myself only a little more than a beginner. I can think of some
> >> problems with both approaches, but I may have overlooked many others:
> >>
> >> For instance, if I followed a), I guess that Quartus II would think that
> >> the processing happens at 25.6 MHz: if there is a long combinational
> >> path between registers, the timing analyzer will not be able to figure
> >> out that the data and the enable signal are stable during 16 clock
> >> cycles. Is there a way to provide this info to Quartus II? OTOH, using
> >> the same signal as an enable for everything further down does not seem
> >> sound enough, thinking of fanouts. So what?
> >>
> >> If I tried to follow b), how would I ensure that there is the proper
> >> phase relationship between both clocks? Is there a way to achieve this?
> >>
> >> Thanks for any advice.
> >
> > If you have not that much experience, try to use only _one_ clock for
> > everything in the FPGA; this gives a synchronous design. As this clock
> > is higher than the ADC clock, you can easily treat the signals from the
> > ADC as asynchronous and still capture them properly (oversampling of
> > the ready signal allows you to determine when the data is stable).
> > The easiest design is completely synchronous with only one clock,
> > considering all inputs as asynchronous. The frequencies you mention
> > indicate no reason why that should not be possible in your case.
>
> Well, this is just to get started. Once I'm running I will try to speed
> everything up as much as possible, just to learn something from it :)
> So, I'd also like to know "the" way to do it right.
>
> BTW, I don't understand what you mean by saying that I can treat the
> signals as asynchronous: At a given time point in the ADC serial stream,
> I generate a 1-clock-wide signal that indicates that the data is ready.
> In the approach a) I plan to use this signal as an enable for all the
> registers in the processing path.
>
> main clock  TTTTTTTTTTTTTTTTTTTTTT....TTTTTTTTTTTTTTT
>                  _____     _____            _____
> ADC clock   ____|     |___|     |___....___|     |___...
>                                                    _
> smpl_rdy    ______________________________________| |___...
>
> From all this, I would say that I am doing a fully synchronous design.

As Rickman already stated, there are ways to do this right and more ways to do it wrong. Synchronous design means the same frequency with a _known_ phase relation (within a reasonable window) of the clock at all nodes at which you use the clock. The same frequency with an unknown phase relation should be considered asynchronous if you cannot guarantee that the phase will not drift across a clock edge in a way that leads to undesired behavior.

bye Thomas

Article: 154709
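The "treat it as asynchronous" recipe Thomas refers to is usually implemented as a two-flop synchronizer plus an edge detector in the fast clock domain. The Python sketch below models only the register behavior (a functional model cannot capture metastability itself, which is the very reason the two synchronizer flops exist):

```python
# Behavioral sketch of a 2-flop synchronizer with rising-edge detect:
# ff1 is the only register ever exposed to the changing async input,
# ff2 is the "safe" synchronized copy, ff3 holds the previous value of
# ff2 so a rising edge yields a single-cycle pulse.

def sync_and_edge_detect(async_levels):
    """async_levels: level of the async input as seen at each fast-clock
    edge. Returns a pulse stream, high for one cycle per rising edge."""
    ff1 = ff2 = ff3 = 0
    pulses = []
    for level in async_levels:
        pulses.append(1 if (ff2 and not ff3) else 0)
        ff3 = ff2        # history flop for edge detection
        ff2 = ff1        # second synchronizer flop
        ff1 = level      # first flop: may sample a changing input
    return pulses
```

Oversampling a slow ready signal this way turns an input with unknown phase into a clean one-cycle enable in the single fast clock domain, at the cost of a couple of cycles of latency.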
On Fri, 21 Dec 2012 02:57:47 -0600, Valdez wrote: >>Valdez wrote: > I will have to check this in ISim on Windows, because I can't run it on > Ubuntu. Why not? I've been running ISim on both OpenSuse 11.x and Debian, so it ought to run on Ubuntu. (There may be some dependency you have to install, or some path to fix, but that's about it...) - BrianArticle: 154710
>On Fri, 21 Dec 2012 02:57:47 -0600, Valdez wrote:
>
>>>Valdez wrote:
>
>> I will have to check this in ISim on Windows, because I can't run it on
>> Ubuntu.
>
>Why not? I've been running ISim on both OpenSuse 11.x and Debian, so it
>ought to run on Ubuntu.
>
>(There may be some dependency you have to install, or some path to fix,
>but that's about it...)
>
>- Brian

Yes, missing libraries. I tried to follow some instructions, but I can't run ISim when the project includes Core Generator modules. I will try during Xmas, when there will be more time and no access to hardware :)

---------------------------------------
Posted through http://www.FPGARelated.com

Article: 154711
I'm having a hell of a time getting started with the pBlazSIM and pBlazASM toolchain. Does anyone have a good resource?

Article: 154712
it is all very fuzzy now. http://www.hqew.net/product-data/LM317T

Article: 154713
Do you support Ada?

Article: 154714
On 2012-12-12, scrts <delete@myemail.com> sent: |--------------------------------------------------------------------| |[David Brown sent:] | |["]And since the world extends a long way outside the USA, have you | |considered moving abroad? Certainly Norway has a great shortage of | |engineers.["] | | | |Really? Where can I search for job listings?" | |--------------------------------------------------------------------| It is not Norway but you can try: WWW.EraCareers.Pt Beware that the economy in Portugal is not excellent. Good luck. With kind regards, Colin Paul GlosterArticle: 154715
On 2012-12-17, David Brown <david@westcontrol.removethisbit.com> sent: |---------------------------------------------------------------------------| |"[. . .] | | | |More importantly, you have to consider if you can live in the country | |you are looking at. Language is a big issue. In many technical | |departments of companies in Scandinavia, much of the work is in English. | |But towards the south of Europe, it is mostly in native languages | |(except in the bigger and more international companies). Outside work, | |you can get by with English more easily in Northern Europe and smaller | |countries, but less so further south and in bigger countries. A | |particular point here is that in countries like Norway or the | |Netherlands, much of the TV is foreign and in English with local | |subtitles - while in bigger countries like Spain and Italy, far more of | |the foreign TV is dubbed. Similarly with books and translations. This | |makes a big difference to how familiar people are with English in | |everyday life. | | | |[. . .]" | |---------------------------------------------------------------------------| Television is important, but not everybody uses television. Soundtracks in Portugal are often in English but many people in Portugal do not understand English, and this is at a similar level to Italy where television is dubbed. (A former colleague who spent most of her life in Italy improved her English by using television in the Netherlands.)Article: 154716
On 12/28/2012 3:28 AM, joey899244@yahoo.cn wrote:
> it is all very fuzzy now. http://www.hqew.net/product-data/LM317T

I don't think there is any dispute that Verilog is easier to learn than VHDL. Which is the better language to learn is often debated. I would say that the choice of language depends on your goals for learning it. If you are doing hobby work, then you will have to choose yourself. If you want to get work in the field, I would say learn both, as there are lots of jobs in each, but most employers prefer one or the other. I think if you learn VHDL first, Verilog will feel like a breath of fresh air... lol

That's my advice. Meanwhile I am working with VHDL and have never put much effort into learning Verilog because I can't find a good book to use as a reference and learning guide. I'm told none of the Verilog books are all that good.

There is also SystemVerilog and a number of other languages, I believe.

Rick

Article: 154717
joey899244@yahoo.cn wrote: > it is all very fuzzy now. http://www.hqew.net/product-data/LM317T When I decided to learn verilog, I was told that C programmers find verilog easier than VHDL. Still seems true to me. For some time, the usual FPGA software had more support for VHDL, but I think most now support both. Also, for some time it seems that most ASIC work was done in verilog, while more FPGA work done in VHDL. I don't know if that is still true. I can usually read VHDL, but won't claim to be able to write it. -- glenArticle: 154718
On Fri, 28 Dec 2012 17:50:10 -0500, rickman wrote: > On 12/28/2012 3:28 AM, joey899244@yahoo.cn wrote: >> it is all very fuzzy now. http://www.hqew.net/product-data/LM317T > > I don't think there is any discussion that Verilog is easier to learn > than VHDL. As to which is the better language to learn is often > debated. I would say that the choice of language would depend on your > goals for learning the language. If you are doing hobby work, then you > will have to choose yourself. If you want to get work in the field, I > would say learn both as there are lots of each, but most employers > prefer one or the other. I think if you learn VHDL first, Verilog will > feel like a breath of fresh air... lol > > That's my advice. Meanwhile I am working with VHDL and have never put > much effort into learning Verilog because I can't find a good book to > use as a reference and learning guide. I'm told none of the Verilog > books are all that good. > > There is also System Verilog and a number of other languages I believe. I used Thomas & Moorby's "The Verilog Hardware Description Language", Fourth Edition. It seemed to be OK. Verilog vs. VHDL varies a lot by industry and region, as well as by company. I chose Verilog because I live on the west coast, and most FPGA work around here is done in Verilog. But the _second_ time I had to make money doing FPGA design, it was for an east coast company and they used VHDL. So now I know both equally well. (Or, more accurately, equally poorly). -- My liberal friends think I'm a conservative kook. My conservative friends think I'm a liberal kook. Why am I not happy that they have found common ground? Tim Wescott, Communications, Control, Circuits & Software http://www.wescottdesign.comArticle: 154719
Hi all,

I started to look into alternatives to Verilog and VHDL and stumbled over chisel from UCB: http://chisel.eecs.berkeley.edu/

Any experiences and comments on this language?

It looks like some challenge for me, as it involves practically learning three new languages at once: chisel itself, Scala, on which it is based, and Verilog, which is produced (I'm used to VHDL).

Cheers,
Martin

PS: I was *very* long absent from this group ;-)

Article: 154720
Tim Wescott <tim@seemywebsite.com> wrote: (snip on verilog vs. VHDL) > I used Thomas & Moorby's "The Verilog Hardware Description Language", > Fourth Edition. It seemed to be OK. > Verilog vs. VHDL varies a lot by industry and region, as well as by > company. I chose Verilog because I live on the west coast, and most FPGA > work around here is done in Verilog. But the _second_ time I had to make > money doing FPGA design, it was for an east coast company and they used > VHDL. > So now I know both equally well. (Or, more accurately, equally poorly). For even more fun, use both in the same project. Even more, add in some AHDL and schematic capture at the same time. (Yes, I did that once.) -- glenArticle: 154721
"Martin Schoeberl" <martin@jopdesign.com> wrote in message news:1621424063378432030.694310martin-jopdesign.com@reader.albasani.net... > Hi all, > > started to look into alternatives to Verilog and VHDL and > stumbled over chisel from UCB: > http://chisel.eecs.berkeley.edu/ > > Any experiences and comment on this language? > > Looks like some challenge for me as it involves practically > learning 3 new languages at once: chisel itself, Scala on which > it is based, and Verilog, which is produced (I'm used to VHDL). > > Cheers, > Martin > > PS: I was *very* long absent from this group ;-) > You might find this interesting: http://www.myhdl.orgArticle: 154722
On 12/28/2012 7:00 PM, Tim Wescott wrote: > On Fri, 28 Dec 2012 17:50:10 -0500, rickman wrote: > >> On 12/28/2012 3:28 AM, joey899244@yahoo.cn wrote: >>> it is all very fuzzy now. http://www.hqew.net/product-data/LM317T >> >> I don't think there is any discussion that Verilog is easier to learn >> than VHDL. As to which is the better language to learn is often >> debated. I would say that the choice of language would depend on your >> goals for learning the language. If you are doing hobby work, then you >> will have to choose yourself. If you want to get work in the field, I >> would say learn both as there are lots of each, but most employers >> prefer one or the other. I think if you learn VHDL first, Verilog will >> feel like a breath of fresh air... lol >> >> That's my advice. Meanwhile I am working with VHDL and have never put >> much effort into learning Verilog because I can't find a good book to >> use as a reference and learning guide. I'm told none of the Verilog >> books are all that good. >> >> There is also System Verilog and a number of other languages I believe. > > I used Thomas& Moorby's "The Verilog Hardware Description Language", > Fourth Edition. It seemed to be OK. I'm not sure how good "ok" is I guess. I asked in the Verilog group for recommendations for a good book and several told me there were no "good" Verilog books. I should pick up a good Verilog book sometime. The only thing I have is a book that covers both VHDL and Verilog with many examples done in both languages. I can't recall the name and I'm not sure the book is here at the moment. One of the problems of having dual (or is it trinary) residency. I expect my next text book will be an e-copy if I can get it without locking to hardware. How does that work exactly? > Verilog vs. VHDL varies a lot by industry and region, as well as by > company. I chose Verilog because I live on the west coast, and most FPGA > work around here is done in Verilog. 
But the _second_ time I had to make > money doing FPGA design, it was for an east coast company and they used > VHDL. > > So now I know both equally well. (Or, more accurately, equally poorly). Yes, I did some work for a networking company once. I did my code in VHDL which they didn't mind since it was a self contained board. But all of their work was Verilog which I got to see. I found it less well documented and much poorer use of white space and formatting. But much of that is just what you are used to I'm sure. I don't think they suffered any great loss of productivity. I believe there have been a few competitions where Verilog was shown to be a bit more productive in banging out code you can do in a few hours. This is not the end all, be all of language comparisons however. I'm sure the long term costs of writing code are not completely correlated to how quickly a few hundred lines of code can be written and debugged. RickArticle: 154723
"garyr" <garyr@fidalgo.net> wrote: > "Martin Schoeberl" <martin@jopdesign.com> wrote in message > news:1621424063378432030.694310martin-jopdesign.com@reader.albasani.net... >> Hi all, >> >> started to look into alternatives to Verilog and VHDL and >> stumbled over chisel from UCB: >> http://chisel.eecs.berkeley.edu/ >> >> Any experiences and comment on this language? >> >> Looks like some challenge for me as it involves practically >> learning 3 new languages at once: chisel itself, Scala on which >> it is based, and Verilog, which is produced (I'm used to VHDL). >> >> Cheers, >> Martin >> >> PS: I was *very* long absent from this group ;-) >> > > You might find this interesting: http://www.myhdl.org Thanks for pointing this out, although it was not the 'real' answer to my question ;-) I'm already looking at MyHDL and some other projects. chisel just lookes most promising at the moment. Maybe I change my mind. A good comparison would be to do one HW design, e.g. a standard MIPS pipeline, in all languages and compare the efficiency of the language and the efficiency of the resulting HW implementation. MartinArticle: 154724
On 12/12/2012, Tim Wescott sent: |--------------------------------------------------------------------------| |"[. . .] | | | |Pay close attention to cost of living: I had a friend from the Portland | |(OR) area who went to work at a company in Silicon Valley at a pay rate | |that astonished him, only to discover after he had moved that the cost of | |rent and food and damn near everything else is astonishing, too -- he | |soon found a gig in Seattle for way less pay and higher net return. This | |cuts both ways: a pay rate that sucks in Portland (and is in the sub- | |basements in San Jose) may get you ahead in the long run if the job is in | |Missoula. | | | |[. . .]" | |--------------------------------------------------------------------------| Do not be conned into a bad-faith contract to provide assistance finding somewhere affordable to live: HTTP://69.54.212.166/~gloster/Evil_which_is_so-called_science/Mariano_Gago/Attachment_emailed_to_Gago_during_August_2012.htm