Overall HDL Filter Code Optimization
Optimize for HDL
By default, generated HDL code is bit-compatible with the numeric results produced by the original filter object. The Optimize for HDL option generates HDL code that is slightly optimized for clock speed or space requirements. However, this optimization causes the coder to:
Implement an adder-tree structure.
Make tradeoffs concerning data types.
Avoid extra quantization.
Generate code that produces numeric results that differ from the results produced by the original filter object.
To optimize generated code for clock speed or space requirements:
Select Optimize for HDL in the Filter architecture pane of the Generate HDL tool.
Consider setting an error margin for the generated test bench. The error margin is the number of least significant bits the test bench ignores when comparing the results. To set an error margin,
Select the Test Bench pane in the Generate HDL tool. Then click the Configuration tab.
Set the Error margin (bits) field to an integer that indicates the maximum acceptable number of bits of difference in the numeric results.
Continue setting other options or click Generate to initiate code generation.
Command-Line Alternative: Use the generatehdl function with the property OptimizeForHDL to enable these optimizations.
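As a sketch of the command-line workflow, the following example enables the optimization while generating code and a test bench. The lowpass filter design is only an illustrative filter object, and pairing OptimizeForHDL with a test bench error margin is a suggested practice, not a requirement:

```matlab
% Design an example fixed-point FIR filter (illustrative choice).
Hd = design(fdesign.lowpass('N,Fc', 16, 0.4), 'window');
Hd.Arithmetic = 'fixed';

% Generate HDL with the speed/area optimizations enabled.
% Because OptimizeForHDL can change numeric results, also request a
% test bench that tolerates small differences in the output.
generatehdl(Hd, 'OptimizeForHDL', 'on', ...
                'GenerateHDLTestbench', 'on', ...
                'ErrorMargin', 4);
```

Here ErrorMargin is set to 4 so the generated test bench ignores the four least significant bits when comparing results.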
Set Error Margin for Test Bench
Customizations that provide optimizations can generate test bench code that produces numeric results that differ from results produced by the original filter object. These options include:
Optimize for HDL
FIR adder style set to Tree
Add pipeline registers for FIR, asymmetric FIR, and symmetric FIR filters
If you choose to use these options, consider setting an error margin for the generated test bench to account for differences in numeric results. The error margin is the number of least significant bits the test bench ignores when comparing the results. To set an error margin:
Select the Test Bench pane in the Generate HDL tool.
Within the Test Bench pane, select the Configuration subpane.
For fixed-point filters, the Error margin (bits) field has a default value of 4. To change the error margin, enter an integer in the Error margin (bits) field. In the figure, the error margin is set to 4 bits.
Command-Line Alternative: Use the generatehdl function with the property ErrorMargin to set the comparison tolerance.
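A minimal command-line sketch follows, combining the optimization options listed above with an error margin. The filter object and the specific property values are illustrative assumptions:

```matlab
% Illustrative fixed-point FIR filter object.
Hd = design(fdesign.lowpass('N,Fc', 16, 0.4), 'window');
Hd.Arithmetic = 'fixed';

% Tree-structured adders and pipeline registers can perturb the
% numeric results, so let the generated test bench ignore the
% 4 least significant bits when comparing against the filter object.
generatehdl(Hd, 'FIRAdderStyle', 'tree', ...
                'AddPipelineRegisters', 'on', ...
                'GenerateHDLTestbench', 'on', ...
                'ErrorMargin', 4);
```

Choosing the margin is a tradeoff: too small and the test bench reports spurious mismatches from the optimizations; too large and it can mask real errors.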