User Guide
Table of Contents
- Contents
- 1. About This MegaCore Function Suite
- Release Information
- Device Family Support
- Features
- Design Example
- Performance and Resource Utilization
- 2D FIR Filter
- 2D Median Filter
- Alpha Blending Mixer
- Avalon-ST Video Monitor
- Chroma Resampler
- Clipper
- Clocked Video Input
- Clocked Video Output
- Color Plane Sequencer
- Color Space Converter
- Control Synchronizer
- Deinterlacer
- Deinterlacer II
- Frame Buffer
- Gamma Corrector
- Interlacer
- Scaler
- Scaler II
- Switch
- Test Pattern Generator
- Trace System
- 2. Getting Started with Altera IP Cores
- 3. Interfaces
- Interface Types
- Avalon-ST Video Protocol
- Avalon-MM Slave Interfaces
- Avalon-MM Master Interfaces
- Buffering of Non-Image Data Packets in Memory
- 4. 2D FIR Filter MegaCore Function
- 5. 2D Median Filter MegaCore Function
- 6. Alpha Blending Mixer MegaCore Function
- 7. Avalon-ST Video Monitor MegaCore Function
- 8. Chroma Resampler MegaCore Function
- 9. Clipper MegaCore Function
- 10. Clocked Video Input MegaCore Function
- 11. Clocked Video Output MegaCore Function
- 12. Color Plane Sequencer MegaCore Function
- 13. Color Space Converter MegaCore Function
- 14. Control Synchronizer MegaCore Function
- 15. Deinterlacer MegaCore Function
- Core Overview
- Functional Description
- Parameter Settings
- Signals
- Control Register Maps
- 16. Deinterlacer II MegaCore Function
- 17. Frame Reader MegaCore Function
- 18. Frame Buffer MegaCore Function
- 19. Gamma Corrector MegaCore Function
- 20. Interlacer MegaCore Function
- 21. Scaler MegaCore Function
- 22. Scaler II MegaCore Function
- 23. Switch MegaCore Function
- 24. Test Pattern Generator MegaCore Function
- 25. Trace System MegaCore Function
- A. Avalon-ST Video Verification IP Suite
- B. Choosing the Correct Deinterlacer
- Additional Information

Appendix A: Avalon-ST Video Verification IP Suite
Constrained Random Test
The code in Example A–5 starts the source and sink, then repeatedly generates a random video data, control, or user packet. Each item is produced by calling randomize() on one of the objects created at the end of the code in Example A–1, putting the randomized object into the source BFM’s mailbox (m_video_items_for_src_bfm), making a copy of the object, and putting that copy into a reference mailbox used by the scoreboard (m_video_items_for_scoreboard).
Finally, the code signals to the scoreboard that a video item has been sent and waits for the output of the DUT to be analyzed, which the scoreboard also signals with an event.
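As a concrete illustration, the SystemVerilog sketch below mirrors this generator flow. It is not the Example A–5 code itself: the mailbox names (m_video_items_for_src_bfm, m_video_items_for_scoreboard) follow the text, while the c_rand_item class, its fields and constraints, the event names, NUM_ITEMS, and the scoreboard stub are illustrative stand-ins for the video, control, and user packet objects created in Example A–1.

```systemverilog
module constrained_random_sketch;

  typedef enum {VIDEO_DATA, CONTROL, USER} e_kind;

  // Stand-in for the Avalon-ST video item classes of Example A-1.
  class c_rand_item;
    rand e_kind       kind;                       // which packet type this item represents
    rand int unsigned width, height;
    // Mostly video data packets, with occasional control and user packets
    constraint c_kind { kind dist {VIDEO_DATA := 8, CONTROL := 1, USER := 1}; }
    constraint c_size { width inside {[32:1920]}; height inside {[32:1080]}; }

    function c_rand_item copy();                  // copy for the reference mailbox
      c_rand_item p = new();
      p.kind   = kind;
      p.width  = width;
      p.height = height;
      return p;
    endfunction
  endclass

  // Items for the source BFM to drive, and reference copies for the scoreboard
  mailbox #(c_rand_item) m_video_items_for_src_bfm    = new();
  mailbox #(c_rand_item) m_video_items_for_scoreboard = new();

  event item_sent;                                // generator: an item has been queued
  event item_checked;                             // scoreboard: the DUT output was analyzed

  localparam int NUM_ITEMS = 20;

  // Randomize an item, queue it for the source BFM, queue a copy for the
  // scoreboard, then wait until the corresponding DUT output has been checked.
  initial begin : generator
    c_rand_item item;
    repeat (NUM_ITEMS) begin
      item = new();
      void'(item.randomize());
      m_video_items_for_src_bfm.put(item);
      m_video_items_for_scoreboard.put(item.copy());
      -> item_sent;
      @(item_checked);
    end
    $display("generator: sent %0d random items", NUM_ITEMS);
  end

  // Stand-in so the sketch runs on its own; the real scoreboard compares the
  // reference copy with the DUT output before acknowledging.
  initial begin : scoreboard_stub
    c_rand_item ref_item;
    forever begin
      m_video_items_for_scoreboard.get(ref_item);
      -> item_checked;
    end
  end

endmodule
```

In this standalone sketch the stub acknowledges each item through a blocking mailbox get() rather than by waiting on item_sent, which avoids a time-zero race; in the real testbench the acknowledgment comes from the scoreboard only after the DUT output has been checked.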
All that remains now is to create the scoreboard, which retrieves the video item
objects from the two scoreboard mailboxes and compares the ones from the DUT with
the reference objects.
Note: The scoreboard expects to see the DUT returning greyscale video data. You must customize the data checking to mirror the behavior of individual DUTs exactly.
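A minimal sketch of such a scoreboard loop is shown below, reusing the c_rand_item class and the m_video_items_for_scoreboard mailbox from the previous sketch (it is assumed to live in the same module or a shared package). The second mailbox, m_video_items_from_dut, assumed to be filled by a monitor watching the DUT output, and the simple field-by-field check are both illustrative; the check must be replaced with one that models what your DUT actually does to each packet.

```systemverilog
// Hypothetical scoreboard loop: pulls one reference item and one DUT item,
// compares them, and releases the generator for the next iteration.
task automatic run_scoreboard(
    mailbox #(c_rand_item) m_video_items_for_scoreboard, // reference copies from the generator
    mailbox #(c_rand_item) m_video_items_from_dut,       // items recovered from the DUT output
    ref event              item_checked);                // handshake back to the generator
  c_rand_item ref_item, dut_item;
  forever begin
    m_video_items_for_scoreboard.get(ref_item);
    m_video_items_from_dut.get(dut_item);
    // Simple like-for-like check; a DUT that transforms the video (scaler,
    // deinterlacer, and so on) needs a model of that transformation here.
    if (dut_item.kind   != ref_item.kind  ||
        dut_item.width  != ref_item.width ||
        dut_item.height != ref_item.height)
      $error("scoreboard: DUT item does not match reference item");
    -> item_checked;                                      // let the generator send the next item
  end
endtask
```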