If you have access to ChipScope you can run an ILA remotely using the CSE server. You can do the same thing if you are using an Altera SignalTap core.Article: 141101
Marc Kelly <phmpk@reverse_this_gro.sndnyd.ranehca> writes: > installing a full development suite onto the desktop in the lab and > using rsync to keep it up to date, unless my experiments with USB over I've been using git a lot lately to keep track of changes I make across many different locations, e.g. simulating at home, then committing and pushing the RTL changes to the build server in the lab, etc. Petter -- A: Because it messes up the order in which people normally read text. Q: Why is top-posting such a bad thing? A: Top-posting. Q: What is the most annoying thing on usenet and in e-mail?Article: 141102
plaquidara@gmail.com wrote: > Synplicity appears to be assigning the embedded flash core of the > analog system to a flash block rather than a spare page of the flash > memory. The result is the embedded flash core of the analog system > ties up an entire 2 Mbit flash block. Has anyone seen this issue > and know a solution? > Has anyone been able to implement 4 Mbit of flash memory and the > analog functions in an AFS600? I have the Fusion eval board and I have compiled and tested the example program http://news.yasep.org/post/2009/01/19/Yet-another-new-Actel-toy-o/ but this does not use the embedded Flash memory so I don't know exactly what happens for you. What do the manuals & datasheet say? Do you use the automatic tools, or pure/hand-written VHDL files that explicitly mention the Flash resources? I'm curious about your issue. yg -- http://ygdes.com / http://yasep.orgArticle: 141103
On 30 May, 06:29, -jg <Jim.Granvi...@gmail.com> wrote: > On May 29, 11:24 pm, gert1999 <ggd...@gmail.com> wrote: > # Reply to jg: it is strange but it makes a difference. I've been > # reading lots of pdf files that are available on the web and I found > # one of them (a kind of quick-start example) emphasises "make sure you > # add the following timing constraints" > > Did you compare the fitter report files ? > > Are you saying SOME files work without adding the constraint, and some > need it added ? > How loose can it be ? > > Xilinx CPLD flows are not the main testing area, so if you've found > context-dependent > quirks, I'd feed some examples back to Xilinx. > > -jg This will be the last article I write in this thread. I decided to take a different approach. The main problem is that I really don't know what I am looking for. Perhaps a stupid error in the HDL code is causing the problem, it might have to do with the timing, ... I will work on 3 different approaches: 1. Code that can be written in one single HDL module works fine if timing constraints as specified above in this thread are provided to the board. 2. I will use the handbook example (code provided on the resource CD) to create some kind of template (e.g. with 4 inputs A, B, C, D that can be assigned to a function in the design like reset, count enable, ... and an output to the 7-segment display) so it becomes a basis for implementing counter modules. 3. I will rewrite this code from scratch and build it up sequentially (clock module to 1 LED, clock module driving a counter to 4 LEDs, the previous driving the 7-segment display) so I can trace the point where trouble starts. Thanks all for the help, perhaps someday it will work fine ;-) GertArticle: 141104
I am looking at doing an FPGA video processor board design with an analog VGA-style component output. I have looked at quite a few video DACs from TI, NXP, and Analog Devices that will do this. Most of the parts I have seen can do component VGA-style and composite NTSC-style outputs. Now, most VGA-style interfaces require 3 analog video pins, as well as two digital sync pins (VSYNC & HSYNC). (see: http://en.wikipedia.org/wiki/VGA_connector ) The video DAC parts typically do not generate these digital outputs. I am left with the task of driving them from the FPGA. There are two things that concern me. One is delay. How much delay is induced on the pixels by the video DAC? Do I even need to worry about delaying the sync signals out of the FPGA in order to match the delay inherent in going from digital to analog in the video DAC? In other words, do I need to worry about delaying the sync signal outputs from the FPGA to match the delay inherent in the digital-to-analog converter IC in order to keep the analog pixel signals temporally aligned with the digital sync signals? Perhaps the conversion delay is so slight that it is only of concern at higher resolutions? The second concern has to do with signal voltage levels/current strength. Are the 3.3V LVTTL/LVCMOS outputs from my FPGA compatible with whatever my VGA receiver is expecting to see at its inputs? Do I need some kind of level shifter or buffer to make this work? Thanks for your help.Article: 141105
Antti.Lukats@googlemail.com <Antti.Lukats@googlemail.com> wrote: > On May 26, 5:26 pm, Uwe Bonnes <b...@elektron.ikp.physik.tu- > darmstadt.de> wrote: > > Sandro <sdro...@netscape.net> wrote: > > > Uwe, > > > maybe you already know it... else you can be interessed > > > in this project (GPL): > > > http://www.urjtag.org/ > > > They does support a lot of cable and > > > EXPERIMENTALLY (they declare to NOT to use it) they can > > > do something with xilinx platform cable too (both embedded and > > > external)! (below you can find the > > > list of cable supported) > > > Rudi, > > > with 10.x I successfully used > > > http://www.rmdir.de/~michael/xilinx/ > > > I didn't try with 11.x (still not downloaded...) > > > maybe should you wait the new libusb-driver version ;-) > > > > xc3sprog takes JEDEC/Bitfile directly, so no need to generate SVF, and a > > chance to generate better error feedback. xc3sprog only needs command > > line parameters, so it integrates nicely into makefiles. Speed may be > > another issue. > > > I think Rudi needs to use Xilinx USB cable with XILINX tools > not with 3rd party XXX or 3rd party FX2 firmware I added DLC10 support today to sourceforge xc3sprog. Thanks to Kolja for his work on the Urjtag xpc.c driver! -- Uwe Bonnes bon@elektron.ikp.physik.tu-darmstadt.de Institut fuer Kernphysik Schlossgartenstrasse 9 64289 Darmstadt --------- Tel. 06151 162516 -------- Fax. 06151 164321 ----------Article: 141106
wallge <wallge@gmail.com> wrote: >I am looking at doing an FPGA video processor board design for with an >analog VGA style component output. > >I have looked at quite a few video DACs from TI, NXP, Analog >Devices, that will do this. Most of the parts I have seen can do >component VGA style and composite NTSC style outputs. > >Now, most VGA style interfaces require 3 analog video pins, as well as >two digital sync pins (VSYNC & HSYNC). >(see: http://en.wikipedia.org/wiki/VGA_connector ) >The video DAC parts typically do not generate these digital outputs. I >am left with the task of driving them from the FPGA. >There are two things that concern me. One is delay. How much delay is >induced on the pixels by the video DAC? >Do I even need to worry about delaying the sync signals out of the >FPGA in order to match the delay inherent in No. The sync signals are pretty low frequency anyway and transported over non balanced lines. I suspect TFT monitors use the RGB signals to lock the image on. >The second concern has to do with signal voltage levels/current >strength. Are the 3.3V LVTTL/LVCMOS outputs from my FPGA compatible >with whatever my VGA receiver is expecting to see at its inputs? Do I >need some kind of level shifter or buffer to make this work? Some very old CRT monitors may not like 3.3V signals. I had some problems with that 10 years ago. But most monitors with TTL compliant inputs will work fine. -- Failure does not prove something is impossible, failure simply indicates you are not using the right tools... "If it doesn't fit, use a bigger hammer!" --------------------------------------------------------------Article: 141107
wallge <wallge@gmail.com> wrote: < I am looking at doing an FPGA video processor board design for with an < analog VGA style component output. (snip) < Now, most VGA style interfaces require 3 analog video pins, < as well as two digital sync pins (VSYNC & HSYNC). Not knowing the exact answer, I believe that many can find the sync signal on the green input if VSYNC and HSYNC aren't provided. (snip on timing) < The second concern has to do with signal voltage levels/current < strength. Are the 3.3V LVTTL/LVCMOS outputs from my FPGA compatible < with whatever my VGA receiver is expecting to see at its inputs? Do I < need some kind of level shifter or buffer to make this work? I believe the usual case is a series resistor such that into a 75 ohm load the appropriate voltage is supplied. Well, that may be more for the video signal, but also in many cases for the sync signal. Find the expected voltage and input impedance for your monitor. -- glenArticle: 141108
On Jun 5, 1:04 am, Tommy Thorn <tommy.th...@gmail.com> wrote: > On Jun 3, 9:26 pm, David <dh1...@gmail.com> wrote: > > > > > On May 21, 6:06 am, Tommy Thorn <tommy.th...@gmail.com> wrote: > > > > On May 18, 5:06 pm, DH <dh1...@gmail.com> wrote: > > > > > Hi, thanks to everyone that replied to this post! I really appreciate > > > > it. > > > > > To Antti: Thanks, the YARI processor looks like a good processor. > > > > Why thank you :-) > > > > > Update: After some discussion at school, I'm going > > > > to use the CoWare processor designer at school to make my processor > > > > now, it provides a reference 5 stage pipelined RISC core with > > > > bypassing done already, it's not MIPS, but there's a compiler/assembler/ > > > > linker available. The best thing about this tool (though it may be hard > > > > to do, we'll see) is that you can use it to generate the tool chain > > > > along with the processor RTL code. I will fall back to good old VHDL > > > > if this tool doesn't deliver, thanks to everyone that replied to this > > > > post! > > > > Good luck with that. If you change your mind and want to give YARI a > > > spin, I can help you. > > > > It might be useful to know how your core will be used. Fx. will it be > > > backed up with external memory or used as an embedded processor using > > > only the FPGA-provided memory blocks? Which parts of the processor are > > > you planning on experimenting with? > > > > Cheers, > > > Tommy > > > Hi Tommy, > > sorry for the late response, thanks for the offer :) > > my experimentation is going to use the FPGA's internal RAM, and with > > the Altera DE2 being my dev board, I have to deal with the synchronous RAM > > issue some designs I found to have (they use async ram). > > I'm sorry but I couldn't completely parse that statement. The DE2 is a > Terasic board and it uses async SRAM whereas the DE2-70 uses synchronous > SRAM. Neither is hard to use IMO, and YARI comes with support for > both. I think he is talking about the internal block RAM being synchronous. If his design was started based on async RAM, it will require rework to use the internal block RAMs. But then they should fit pretty well with a pipelined architecture... you just need to consider them from the start. > > I'm experimenting with splitting the RISC pipeline into two pipelined > > parts, with each responsible for different instructions from the > > original RISC's instruction set. So for example if the original ISA has 4 > > instructions A B C D, A and B are on the first pipeline, C and D are > > on the second pipeline. This is why I need to look for a processor > > with a 5 stage pipeline. > > I assume you are trying to make it superscalar, that is, issue more > than one instruction each cycle? Otherwise there isn't much point to > multiple parallel pipelines. IMO, the most challenging part about doing > this on an FPGA is that you cannot easily get the multiple write ports > to the register file which you need to be able to retire multiple > instructions per cycle. In fact, all the options for getting that > extra write port pretty much annul the performance benefits of running > multiple instructions. (And I haven't even started talking about the > bypass network and the hazard detection here). You can use two write ports on block RAMs. They can be hard to infer, but they can always be instantiated. RickArticle: 141109
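For reference, a minimal VHDL sketch of the kind of true dual-port RAM Rick mentions, with an independent write port on each side. Entity and signal names here are invented, and whether a given synthesis tool accepts this shared-variable template for block RAM inference (or forces you to instantiate the primitive directly) depends on the tool and version; simultaneous writes to the same address from both ports are left undefined.

library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity tdp_ram is
  generic (
    ADDR_WIDTH : integer := 10;
    DATA_WIDTH : integer := 32
  );
  port (
    clk_a  : in  std_logic;
    we_a   : in  std_logic;
    addr_a : in  std_logic_vector(ADDR_WIDTH-1 downto 0);
    din_a  : in  std_logic_vector(DATA_WIDTH-1 downto 0);
    dout_a : out std_logic_vector(DATA_WIDTH-1 downto 0);
    clk_b  : in  std_logic;
    we_b   : in  std_logic;
    addr_b : in  std_logic_vector(ADDR_WIDTH-1 downto 0);
    din_b  : in  std_logic_vector(DATA_WIDTH-1 downto 0);
    dout_b : out std_logic_vector(DATA_WIDTH-1 downto 0)
  );
end entity;

architecture rtl of tdp_ram is
  type ram_t is array (0 to 2**ADDR_WIDTH-1) of std_logic_vector(DATA_WIDTH-1 downto 0);
  -- VHDL-93 style shared variable, as used in common synthesis templates
  -- for true dual-port inference.
  shared variable ram : ram_t;
begin
  -- Port A: synchronous write and registered read.
  process (clk_a)
  begin
    if rising_edge(clk_a) then
      if we_a = '1' then
        ram(to_integer(unsigned(addr_a))) := din_a;
      end if;
      dout_a <= ram(to_integer(unsigned(addr_a)));
    end if;
  end process;

  -- Port B: second, fully independent write/read port.
  process (clk_b)
  begin
    if rising_edge(clk_b) then
      if we_b = '1' then
        ram(to_integer(unsigned(addr_b))) := din_b;
      end if;
      dout_b <= ram(to_integer(unsigned(addr_b)));
    end if;
  end process;
end architecture;

For mismatched port widths or guaranteed mapping, direct instantiation of the vendor block RAM primitive is usually the safer route, as Rick notes.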
On Fri, 5 Jun 2009 12:29:11 -0700 (PDT) wallge <wallge@gmail.com> wrote: > I am looking at doing an FPGA video processor board design for with an > analog VGA style component output. > > I have looked at quite a few video DACs from TI, NXP, Analog > Devices, that will do this. Most of the parts I have seen can do > component VGA style and composite NTSC style outputs. > > Now, most VGA style interfaces require 3 analog video pins, as well as > two digital sync pins (VSYNC & HSYNC). > (see: http://en.wikipedia.org/wiki/VGA_connector ) > The video DAC parts typically do not generate these digital outputs. I > am left with the task of driving them from the FPGA. > There are two things that concern me. One is delay. How much delay is > induced on the pixels by the video DAC? > Do I even need to worry about delaying the sync signals out of the > FPGA in order to match the delay inherent in > going from digital to analog in the video DAC? In other words, do I > need to worry about delaying the sync signal outputs from the FPGA > to match the delay inherent in the digital to analog converter IC > in order to make the analog pixel signals temporally aligned with the > digital sync signals? Perhaps the conversion delay is so slight that > it is only of concern at higher resolutions? > > The second concern has to do with signal voltage levels/current > strength. Are the 3.3V LVTTL/LVCMOS outputs from my FPGA compatible > with whatever my VGA receiver is expecting to see at its inputs? Do I > need some kind of level shifter or buffer to make this work? > > > Thanks for your help. Don't know much about the specifics of your video signal, but I always try to have the cheapest possible part coming out to a connector. Regardless of whether you need any voltage translation or not on those digital signals, a $0.13 SC-70 dual buffer chip will a) allow you to put the high Icc transients needed to drive the capacitive cable somewhere outside of your FPGA and b) hopefully sacrifice itself to save your FPGA in the event of a 30kV ESD zap. -- Rob Gaddi, Highland Technology Email address is currently out of orderArticle: 141110
On Jun 5, 5:35 pm, glen herrmannsfeldt <g...@ugcs.caltech.edu> wrote: > wallge <wal...@gmail.com> wrote: > > < I am looking at doing an FPGA video processor board design for with an > < analog VGA style component output. > (snip) > > < Now, most VGA style interfaces require 3 analog video pins, > < as well as two digital sync pins (VSYNC & HSYNC). > > Not knowing the exact answer, I believe that many can find the > sync signal on the green input if VSYNC and HSYNC aren't provided. > > (snip on timing) > > < The second concern has to do with signal voltage levels/current > < strength. Are the 3.3V LVTTL/LVCMOS outputs from my FPGA compatible > < with whatever my VGA receiver is expecting to see at its inputs? Do I > < need some kind of level shifter or buffer to make this work? > > I believe the usual case is a series resistor such that into > a 75 ohm load the appropriate voltage is supplied. Well, > that may be more for the video signal, but also in many cases > for the sync signal. Find the expected voltage and input > impedance for your monitor. > > -- glen 75 Ohm loading is on the video signals only. The TTL signals should be source series terminated presuming a high impedance at the monitor. Analog CRTs had a wide range of timing between sync and video, and usually had some sort of knob or menu control to adjust the picture. Newer LCD monitors run through a sequence of steps to automatically adjust the picture to the syncs they receive. Most PC video does not use sync-on-green, but many of the DACs provide the capability of adding the integrated sync, which is then usually the XOR of horizontal and vertical sync. Some DACs also provide for HDTV style three-level syncs. If your only requirement is to drive a standard PC monitor you don't need sync on green. More important to LCD monitors is to get the number of pixel clocks per horizontal line period correct. There is a good site for calculating the timing to meet the various VESA standards at: http://www.hut.fi/Misc/Electronics/faq/vga2rgb/calc.html Regards, GaborArticle: 141111
On Fri, 5 Jun 2009 15:03:37 -0700 (PDT), gabor <gabor@alacron.com> wrote: >On Jun 5, 5:35 pm, glen herrmannsfeldt <g...@ugcs.caltech.edu> wrote: >> wallge <wal...@gmail.com> wrote: >> >> < I am looking at doing an FPGA video processor board design for with an >> < analog VGA style component output. >> (snip) Take a look at the Chrontel CH7301A - it does analogue VGA and DVI output, and has a DDR input to save on pins.Article: 141112
On Jun 6, 2:11 am, Mike Harrison <m...@whitewing.co.uk> wrote: > On Fri, 5 Jun 2009 15:03:37 -0700 (PDT), gabor <ga...@alacron.com> wrote: > >On Jun 5, 5:35 pm, glen herrmannsfeldt <g...@ugcs.caltech.edu> wrote: > >> wallge <wal...@gmail.com> wrote: > > >> < I am looking at doing an FPGA video processor board design for with an > >> < analog VGA style component output. > >> (snip) > > Take a look at the Chrontel CH7301A - it does analogue VGA and DVI output, and has a DDR input to > save on pins. I have actually been considering the Chrontel parts... the DDR input is one thing that concerns me. For some applications I will have to run my video at 1280x1024@60Hz (SXGA). The VESA pixel clock for this is 108MHz. My current system design will have the FPGA on a main processor board and my VGA converter chip on a separate breakout IO board. If I have to DDR-ize my SXGA outputs, we are now talking about signals transitioning at 216MHz, which makes me worry about signal integrity and timing issues. I am using a Cyclone III and am not sure if I can meet this timing. I am also worried about these single ended signals traversing the board to board connector between the FPGA board and the breakout IO board... Anyone have an opinion on this? Also, do these DDR signals need to be at the 2.5V standard, or will 3.3V TTL/CMOS work just fine?Article: 141113
On Jun 4, 4:43 pm, "MM" <mb...@yahoo.com> wrote: > It sounds as you need to clean your project. Unfortunately, Xilinx tools > aren't reliable in detecting design changes. There are several ways of doing > this. You can do it from ISE Project menu, or you could clean hardware from > the EDK Hardware menu first just to be sure. You could also manually delete > relevant (system.ncd, clock_generator.ncd ) or simply all ncd files from the > directory tree. Note that there is a cache directory where EDK stores copies > of ncd files. Mikhail, thank you, that's it!! MRArticle: 141114
On Sat, 6 Jun 2009 02:56:35 -0700 (PDT), wallge <wallge@gmail.com> wrote: >On Jun 6, 2:11 am, Mike Harrison <m...@whitewing.co.uk> wrote: >> On Fri, 5 Jun 2009 15:03:37 -0700 (PDT), gabor <ga...@alacron.com> wrote: >> >On Jun 5, 5:35 pm, glen herrmannsfeldt <g...@ugcs.caltech.edu> wrote: >> >> wallge <wal...@gmail.com> wrote: >> >> >> < I am looking at doing an FPGA video processor board design for with an >> >> < analog VGA style component output. >> >> (snip) >> >> Take a look at the Chrontel CH7301A - it does analogue VGA and DVI output, and has a DDR input to >> save on pins. > >I have actually been considering the chrontel parts... the DDR input >is one thing that concerns me. For some applications I will have to >run my video at 1280x1024@60Hz (SXGA). This VESA clock speed for this >is 108MHz. My current system design will have the FPGA on a main >processor board and my VGA converter chip on a separate breakout IO >board. If I have to DDR-ize my SXGA outputs, we are now talking about >signals transitioning at 216MHz, which makes me worry about signal >integrity and timing issues. I am using a Cyclone III and am not sure >if I can meet this timing. I am also worried about these single ended >signals traversing the board to board connector between the FPGA board >and the breakout IO board... Anyone have an opinion on this? >Also, do these DDR signals need to by at 2.5V standard, or will 3.3V >TTL/CMOS work just fine? The CH7301 has a Vref input to set the threshold of the digital inputs, and the IO power supply range is specced as 1.1 to 3.3v, so 2.5v should be fineArticle: 141115
On Jun 5, 9:29 pm, wallge <wal...@gmail.com> wrote: > Do I even need to worry about delaying the sync signals out of the > FPGA in order to match the delay inherent in > going from digital to analog in the video DAC? In other words, do I > need to worry about delaying the sync signal outputs from the FPGA > to match the delay inherent in the digital to analog converter IC > in order to make the analog pixel signals temporally aligned with the > digital sync signals? Perhaps the conversion delay is so slight that > it is only of concern at higher resolutions? I think it depends on how big the delay is! Usually VGA monitors are a little bit tolerant... Before and after the h/v sync time there are a "back porch" time and a "front porch" time, and if you use more time for the front porch and less for the back porch then the display usually just shows the image a little bit to the left or to the right (up/down for vertical sync), but you should be able to tune the monitor with its controls (some monitors have auto-tuning features...). > > The second concern has to do with signal voltage levels/current > strength. Are the 3.3V LVTTL/LVCMOS outputs from my FPGA compatible > with whatever my VGA receiver is expecting to see at its inputs? Do I > need some kind of level shifter or buffer to make this work? The R/G/B signals are in the range 0 - 0.7 V. If you use only one bit per color component (3 bits total) then you can drive them directly from the FPGA: with a 270 ohm series resistor (plus the 75 ohm termination of the VGA input) you obtain about 0.7 V when the FPGA pin is at 3.3 V. For the v/h sync an 82.5 ohm series resistor suffices. Please take a look here http://www.opencores.org/?do=project&who=yavga There is a very simple schematic using 1 bit per color component, and if you get the VHDL sources, in the vga_ctrl.vhd file you can find some information about the VGA timings embedded in the VHDL as comments. For better results (8 bits per color component) you must use a DAC for R/G/B and an 82.5 ohm series resistor for the h/v sync signals. Maybe you can obtain something functional with a resistor ladder too (http://en.wikipedia.org/wiki/Resistor_ladder) but I didn't try it (I don't know how a resistor ladder behaves with a signal at as high a frequency as the pixel clock for 1280x1024). Regards SandroArticle: 141116
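To make the porch/sync arithmetic concrete, here is a minimal VHDL sketch of a 640x480@60 Hz sync generator, assuming a ~25 MHz pixel clock (nominally 25.175 MHz) and the usual totals of 640+16+96+48 = 800 clocks per line and 480+10+2+33 = 525 lines per frame, with both syncs active low. Entity and signal names are invented, and for a clean board-level output you would normally register hsync/vsync/blank in the pixel-clock domain rather than drive them combinationally as shown.

library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

-- 640x480@60 Hz: horizontal 640 visible + 16 front porch + 96 sync + 48 back porch,
--                vertical   480 visible + 10 front porch +  2 sync + 33 back porch.
entity vga_sync is
  port (
    pixclk : in  std_logic;
    hsync  : out std_logic;                 -- active low
    vsync  : out std_logic;                 -- active low
    blank  : out std_logic;                 -- '1' outside the visible area
    hcount : out unsigned(9 downto 0);
    vcount : out unsigned(9 downto 0)
  );
end entity;

architecture rtl of vga_sync is
  signal h : unsigned(9 downto 0) := (others => '0');
  signal v : unsigned(9 downto 0) := (others => '0');
begin
  -- Free-running horizontal/vertical counters.
  process (pixclk)
  begin
    if rising_edge(pixclk) then
      if h = 799 then
        h <= (others => '0');
        if v = 524 then
          v <= (others => '0');
        else
          v <= v + 1;
        end if;
      else
        h <= h + 1;
      end if;
    end if;
  end process;

  -- Sync pulses are low only during the sync interval, after the front porch.
  hsync <= '0' when (h >= 640 + 16 and h < 640 + 16 + 96) else '1';
  vsync <= '0' when (v >= 480 + 10 and v < 480 + 10 + 2)  else '1';
  blank <= '1' when (h >= 640 or v >= 480) else '0';

  hcount <= h;
  vcount <= v;
end architecture;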
Hello, My MicroBlaze processor system needs to access BRAMs that are external to the processor system (but still inside the FPGA, on the VHDL side). Using the EMC, reads and writes from the MicroBlaze side seem ok. VHDL-side reads and writes also seem ok. The BRAMs have mismatched dual ports: the MB side is 32 bits wide, the VHDL side is 8 bits. I could not figure out the memory address organization though. Even the data filled in with the INIT generics of the BRAMs is not read by the MB in the expected order. I would understand a reversed order of bytes within 32-bit words, but why don't the words come one after another? It seems like it is accessing every 4th word on the MB side. For example the data in INIT_00 => X"1F1E1D1C1B1A191817161514131211100F0E0D0C0B0A09080706050403020100", is read as 03020100 13121110 ... Where the hell is 07060504 for example? I had two memory banks controlled by a single EMC. Would that be the problem? Thanks for any possible redirection.Article: 141117
> I am looking at doing an FPGA video processor board design for with an > analog VGA style component output. > Hi, FWIW, we have recently implemented a Full HD 1080p demo on the very affordable Altera NEEK kit! The output image quality is just perfect. But amazingly (well, not so much so), we had to try two VGA cables to get the image perfect. Pixel freq is quite high for Full HD... 1280x1024-60Hz is easier. The BOM for the system required by this demo is very low cost (a small Cyclone III, Flash memory, cheap DDR and video DAC). The demo also uses the embedded video-in -> BT.656 codec (PAL or NTSC autoswitch) and we resize the video dynamically while displaying a full HD resolution background with dynamic OSD and alpha blending. The schematics of the Kit as well as all the documentation are available on the Altera website. This includes the video section indeed so you can check. http://www.altera.com/products/devkits/altera/kit-cyc3-embedded.html The multimedia extension section can be bought separately (from the FPGA board) at Terasic.com. There is also a VGA DAC snap-on board that we've used in the past named Lancelot (used to be at www.fpga.nl ?). I think it used a TI DAC, not sure how high it could go, we haven't tried Full HD on this one. Hope this helps, Bert.Article: 141118
Jan Pech submitted the idea: > On Tue, 2009-06-02 at 09:53 -0700, Antti wrote: >> Hi >> >> does anybody have real and realistic performance figures for Xilinx >> GbE solution with XPS_TEMAC/MPMC ? >> >> we need to get 60% of GbE wirespeed, UDP transmit only but it seems >> like real hard target to reach :( >> >> MPMC has memory latency of 23 cycles (added to EACH memory access >> cycle) so the ethernet >> SDMA takes a lot of bandwidth already, there is another DMA writing >> data at same speed, and the >> PPC itself uses the same memory too >> >> Antti > > With custom Ethernet core + MPMC we get data rates slightly above > 100MBps, depending on MTU. The single memory is shared by MicroBlaze/PPC > for code and data access, at least one streaming data source (custom PIM > for NPI) and the custom Ethernet IP (MAC + some packet composers, > decoders, etc.) again connected to NPI. > We rejected using XPS_TEMAC because of its low performance. The problem is > I lost my benchmark results. Sorry. > > Jan Hello, FYI, a while ago we developed a solution we call GEDEK that achieves 100% GbE performance (we guarantee the simultaneous generation and reception of back-to-back Gigabit Ethernet frames without delay or loss), and our hardware stack has UDP, some ICMP & ARP, without requiring a processor (it is a hardware stack indeed). Available & tested on Xilinx & Altera, 100M, 1GbE, or dual-speed. We provide both ends (FPGA block and PC Win/Linux API in source code). We have options for Remote Flash Programming, Virtual UARTs, WOL etc. Documentation and demos for both vendors are available on demand at info at alse-fr not calm. BertArticle: 141119
<monurlu@gmail.com> wrote in message news:78aee055-7f9b-4859-8b3b-8771cbef08d1@c36g2000yqn.googlegroups.com... > Hello, > Microblaze processor system needs to access BRAMs external (but in > FPGA, VHDL side). Using EMC, reads and writes from microblaze side > seem ok. VHDL side reads and writes also seem ok. BRAMs have > mismatched dual ports. MB side is 32 bits wide, VHDL side is 8 bits. I > could not figure out the memory address organization though. Even the > data filled in with INIT generics of BRAMs are not read from MB in the > expected order. I would understand the reversed order of bytes within > 32 bit words but why the words don't come one after another? It seems > like it is accessing every other 4th word in MB side. For example the > data in > INIT_00 => > X"1F1E1D1C1B1A191817161514131211100F0E0D0C0B0A09080706050403020100", > is read as > 03020100 13121110 ... > Where the hell is 07060504 for example? > I had two memory banks controlled by a single EMC. Would it be the > problem? > Thanks for any possible redirection. It might be just a matter of bit swizzling. MB uses big-endian reversed bit order.Article: 141120
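As a purely illustrative VHDL sketch of what such a byte-lane swizzle looks like (this is not a claim about what the EMC core actually does internally, and it does not explain the every-4th-word symptom): on a big-endian 32-bit bus the byte at the lowest byte address travels on the most significant lanes, so a word filled byte-by-byte from the 8-bit side can read back byte-reversed on the 32-bit side. Entity and signal names are invented.

library ieee;
use ieee.std_logic_1164.all;

-- Hypothetical glue that regroups a 32-bit word between a big-endian view
-- (byte address 0 = most significant byte) and a little-endian packed view
-- (byte address 0 = least significant byte). Whether this is needed, and in
-- which direction, depends on how the memory controller maps byte lanes.
entity byte_swizzle is
  port (
    word_in  : in  std_logic_vector(31 downto 0);
    word_out : out std_logic_vector(31 downto 0)
  );
end entity;

architecture rtl of byte_swizzle is
begin
  word_out(31 downto 24) <= word_in(7 downto 0);    -- byte 0 <-> byte 3
  word_out(23 downto 16) <= word_in(15 downto 8);   -- byte 1 <-> byte 2
  word_out(15 downto 8)  <= word_in(23 downto 16);
  word_out(7 downto 0)   <= word_in(31 downto 24);
end architecture;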
I'm working on a high-frequency project where I have an ADC able to sample the analog signal much faster than the FPGA. Is it possible to send a reference clock to the ADC, then divide it (by 4 for example) and move the resulting clock in phase (intentionally skew it) into 4 phase-shifted clocks? The simple parallel logic then does the multiply/subtract. Is it possible at all? Has anyone seen such a solution in other projects? Thanks Robert DorosaArticle: 141121
rosaldorosa wrote: > I'm working on a high-frequency project where I have an ADC > able to sample the analog signal much faster than the FPGA. > Is it possible to send a reference clock to the ADC, then divide it (by 4 > for example) and move the resulting clock in phase (intentionally > skew it) into 4 phase-shifted clocks? > The simple parallel logic then does the multiply/subtract. > > Is it possible at all? > > Has anyone seen such a solution in other projects? > > Thanks > Robert Dorosa I meant divide by 4 into four parallel clock sources shifted in phase (360deg/4 = 90deg apart). Then four parallel acquisition processes. Then a combiner which collects all the parallel data together (compresses it into one stream). I think it's clearer now. Regards Robert DorosaArticle: 141122
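A minimal VHDL sketch of that capture/combine idea, assuming a PLL or DCM already provides the four equal-frequency clocks 90 degrees apart and that the ADC data bus updates four times per clk0 period. Entity and signal names are invented; a real design needs the phase-crossing paths carefully constrained, capture registers placed in the IOBs, and on many device families a vendor DDR/SERDES input primitive is the more robust way to do this.

library ieee;
use ieee.std_logic_1164.all;

-- Capture one ADC sample on each of four 90-degree-shifted clocks, then
-- gather the four samples into one wide word in the clk0 domain, so the
-- downstream logic runs at a quarter of the effective sample rate.
entity four_phase_capture is
  generic (W : integer := 8);            -- ADC sample width
  port (
    clk0, clk90, clk180, clk270 : in  std_logic;   -- same frequency, 90 deg apart
    adc_data                    : in  std_logic_vector(W-1 downto 0);
    samples_out                 : out std_logic_vector(4*W-1 downto 0)  -- 4 samples per clk0 cycle
  );
end entity;

architecture rtl of four_phase_capture is
  signal s0, s90, s180, s270         : std_logic_vector(W-1 downto 0);
  signal s0_r, s90_r, s180_r, s270_r : std_logic_vector(W-1 downto 0);
begin
  -- One capture register per phase clock.
  process (clk0)   begin if rising_edge(clk0)   then s0   <= adc_data; end if; end process;
  process (clk90)  begin if rising_edge(clk90)  then s90  <= adc_data; end if; end process;
  process (clk180) begin if rising_edge(clk180) then s180 <= adc_data; end if; end process;
  process (clk270) begin if rising_edge(clk270) then s270 <= adc_data; end if; end process;

  -- Resynchronise all four captures to the clk0 domain, then pack them into
  -- one word: oldest sample in the low bits, newest in the high bits.
  process (clk0)
  begin
    if rising_edge(clk0) then
      s0_r   <= s0;
      s90_r  <= s90;
      s180_r <= s180;
      s270_r <= s270;
      samples_out <= s270_r & s180_r & s90_r & s0_r;
    end if;
  end process;
end architecture;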
Bert_Paris <do_not_spam_@me> wrote: >> I am looking at doing an FPGA video processor board design for with an >> analog VGA style component output. >> >Hi, > >FWIW, we have recently implemented a Full HD 1080p demo on the very >affordable Altera NEEK Kit ! The output image quality is just perfect. >But amazingly (well, not so much so), we had to try two VGA cables to >get the image perfect. Pixel freq is quite high for Full HD... Not all VGA cables have the right impedance which results in ghosting (the image gets repeated due to reflections). -- Failure does not prove something is impossible, failure simply indicates you are not using the right tools... "If it doesn't fit, use a bigger hammer!" --------------------------------------------------------------Article: 141123
"OutputLogic" <evgenist@gmail.com> wrote in message news:265928e4-7370-4f06-b4ab-40f23c7dbb11@n8g2000vbb.googlegroups.com... "IP over USB" is doable (there are USB network adapters), but "USB over IP" is hard to implement (don't want to get into details why). ======== Does that even make sense? USB is a bus standard, sort of like saying "Ethernet over IP". >>>>>>>>>>>>>>>>>>>>>>>>>>> There are JTAG debuggers with the Ethernet interface, and I used to work with one to debug Linux apps on Freescale processors. It'd be nice if Xilinx adds such a pod to their existing ones with USB and Parallel interfaces. - evgeni ========= I don't really get it. What's the issue with cableserver? (I only dislike the constant polling, or whatever it's doing, connecting and closing a socket every second when iMPACT is idle in a remote session. It leaves 300 or so TIME_WAIT sockets on the server. You would think they can write better code than that. At least close the socket properly; better yet, quit the polling.)Article: 141124
Hi. I am using a Virtex-4 device for dynamic partial reconfiguration. The partial bitstream is stored on a CompactFlash card, and a PowerPC is used to fetch it from the CF card into the ICAP. How can I measure the power consumption of dynamic partial reconfiguration using PlanAhead, ISE and EDK? It seems difficult to use XPower to estimate the power for dynamic partial reconfiguration. Any answer would be great for me. Thanks.