CSV Output Format
Introduction
Sailfish can generate comprehensive CSV files containing both individual test results and method comparison data using the [WriteToCsv] attribute. These files use a structured, multi-section format that is Excel-friendly and well suited to data analysis.
Basic Usage
Apply the [WriteToCsv] attribute to any test class:
```csharp
[WriteToCsv]
[Sailfish(SampleSize = 100)]
public class PerformanceTest
{
    [SailfishMethod]
    [SailfishComparison("Algorithms")]
    public void BubbleSort() { /* implementation */ }

    [SailfishMethod]
    [SailfishComparison("Algorithms")]
    public void QuickSort() { /* implementation */ }

    [SailfishMethod]
    public void RegularMethod() { /* implementation */ }
}
```
CSV Structure
The generated CSV files use a multi-section format with clear organization:
Section 1: Session Metadata
```
# Session Metadata
SessionId,Timestamp,TotalClasses,TotalTests
abc12345,2025-08-03T10:30:00Z,1,6
```
Fields:
- SessionId: Unique identifier for the test session
- Timestamp: When the test session completed (UTC)
- TotalClasses: Number of test classes with [WriteToCsv] in the session
- TotalTests: Total number of test methods executed
Section 2: Individual Test Results
```
# Individual Test Results
TestClass,TestMethod,MeanTime,MedianTime,StdDev,SampleSize,ComparisonGroup,Status,CI95_MOE,CI99_MOE
PerformanceTest,BubbleSort,45.200,44.100,3.100,100,Algorithms,Success,1.2345,1.6789
PerformanceTest,QuickSort,2.100,2.000,0.300,100,Algorithms,Success,0.1234,0.2345
PerformanceTest,RegularMethod,1.000,1.000,0.100,100,,Success,0.0500,0.0800
```
Fields:
- TestClass: Name of the test class
- TestMethod: Name of the test method
- MeanTime: Average execution time in milliseconds
- MedianTime: Median execution time in milliseconds
- StdDev: Standard deviation of execution times in milliseconds
- SampleSize: Number of iterations executed
- ComparisonGroup: Comparison group name (if using [SailfishComparison])
- Status: Test execution status (Success/Failed)
- CI95_MOE: Margin of error (ms) at 95% confidence; computed using Student's t distribution and the standard error of the mean (see the sketch below). Precision is adaptively formatted in output displays; the CSV stores numeric values.
- CI99_MOE: Margin of error (ms) at 99% confidence; computed the same way.
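Sailfish computes these margins internally. For intuition, here is a minimal sketch of the same textbook calculation, assuming the MathNet.Numerics package for the t quantile (this is not Sailfish's actual implementation):

```csharp
using System;
using System.Linq;
using MathNet.Numerics.Distributions;

public static class MarginOfError
{
    // MOE = t(alpha/2, n - 1) * s / sqrt(n): the half-width of the
    // confidence interval around the sample mean.
    public static double Compute(double[] samplesMs, double confidence)
    {
        int n = samplesMs.Length;
        double mean = samplesMs.Average();
        // Sample standard deviation (n - 1 denominator).
        double s = Math.Sqrt(samplesMs.Sum(x => (x - mean) * (x - mean)) / (n - 1));
        // Two-sided critical value from Student's t with n - 1 degrees of freedom.
        double t = StudentT.InvCDF(0.0, 1.0, n - 1, 1.0 - (1.0 - confidence) / 2.0);
        return t * s / Math.Sqrt(n);
    }
}
```

Compute(samples, 0.95) corresponds to CI95_MOE and Compute(samples, 0.99) to CI99_MOE.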
Section 3: Method Comparisons
```
# Method Comparisons
ComparisonGroup,Method1,Method2,Mean1,Mean2,Ratio,CI95_Lower,CI95_Upper,q_value,Label,ChangeDescription
Algorithms,BubbleSort,QuickSort,45.200,2.100,21.5,18.3,24.9,0.000,Slower,Regressed
```
Fields:
- ComparisonGroup: Name of the comparison group
- Method1 / Method2: Methods being compared
- Mean1 / Mean2: Mean execution times for each method
- Ratio: Mean1 / Mean2 (unitless). Values > 1 indicate Method1 is slower; values < 1 indicate it is faster.
- CI95_Lower / CI95_Upper: 95% confidence interval endpoints for the ratio (computed on the log scale)
- q_value: Benjamini–Hochberg adjusted p-value (multiple comparisons correction; see the sketch below)
- Label: One of Improved, Similar, or Slower
- ChangeDescription: Legacy summary for backward compatibility (Improved/Regressed/No Change)
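Sailfish applies the multiple-comparisons correction internally; for readers unfamiliar with it, this is a minimal sketch of the standard Benjamini–Hochberg step-up procedure that produces such q-values (not Sailfish's actual code):

```csharp
using System;
using System.Linq;

public static class BenjaminiHochberg
{
    // Sort p-values ascending, scale the i-th (1-based) by m / i, then
    // enforce monotonicity from the largest rank downward. Returns the
    // q-values in the original input order, capped at 1.0.
    public static double[] Adjust(double[] pValues)
    {
        int m = pValues.Length;
        int[] order = Enumerable.Range(0, m).OrderBy(i => pValues[i]).ToArray();
        var q = new double[m];
        double running = 1.0;
        for (int rank = m; rank >= 1; rank--)
        {
            int idx = order[rank - 1];
            running = Math.Min(running, pValues[idx] * m / rank);
            q[idx] = running;
        }
        return q;
    }
}
```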
Session-Based Consolidation
CSV files use session-based consolidation, meaning:
- Single file per session: All test classes with [WriteToCsv] contribute to one file
- Cross-class comparisons: Method comparisons work across different test classes (see the sketch below)
- Unique naming: Files use session IDs and timestamps to prevent conflicts
- Complete data: All test results from the entire session are included
Example filename: TestSession_abc12345_Results_20250803_103000.csv
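For example, two classes in the same session can share a comparison group, and their methods are compared against each other in the consolidated file (class, method, and group names below are illustrative):

```csharp
[WriteToCsv]
[Sailfish]
public class SystemTextJsonTests
{
    [SailfishMethod]
    [SailfishComparison("Serializers")] // same group name as the class below
    public void SystemTextJson() { /* implementation */ }
}

[WriteToCsv]
[Sailfish]
public class NewtonsoftTests
{
    [SailfishMethod]
    [SailfishComparison("Serializers")] // compared against SystemTextJson
    public void NewtonsoftJson() { /* implementation */ }
}
```

Both classes write into the same TestSession_*.csv, and the Method Comparisons section includes the cross-class pair.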
Excel Integration
The CSV format is designed for easy Excel analysis:
1. Import Process
- Open Excel
- Go to Data → Get Data → From Text/CSV
- Select your Sailfish CSV file
- Excel will automatically detect the structure
2. Working with Sections
- Comment lines (starting with #) provide clear section headers
- Consistent column structure within each section
- No mixed data types in columns for reliable sorting/filtering
3. Analysis Examples
Performance Analysis:
```
=AVERAGE(C:C)   // Average mean time across all tests
=MAX(C:C)       // Slowest test
=MIN(C:C)       // Fastest test
```
Comparison Analysis:
```
// Filter the Method Comparisons section
// Sort by Ratio to find the biggest differences
// Create charts showing performance relationships
```
Advanced Features
Multiple Comparison Groups
When you have multiple comparison groups, each generates its own set of comparisons:
```
# Method Comparisons
ComparisonGroup,Method1,Method2,Mean1,Mean2,Ratio,CI95_Lower,CI95_Upper,q_value,Label,ChangeDescription
StringOperations,StringConcat,StringBuilder,15.200,8.100,1.9,1.7,2.2,0.000,Slower,Regressed
StringOperations,StringConcat,StringInterpolation,15.200,12.300,1.2,1.1,1.4,0.023,Slower,Regressed
StringOperations,StringBuilder,StringInterpolation,8.100,12.300,0.66,0.60,0.72,0.001,Improved,Improved
Collections,ListIteration,ArrayIteration,5.400,3.200,1.7,1.5,1.9,0.000,Slower,Regressed
```
N×N Comparison Matrices
For groups with multiple methods, all pairwise comparisons are included:
- 2 methods: 1 comparison
- 3 methods: 3 comparisons (A vs B, A vs C, B vs C)
- 4 methods: 6 comparisons
- N methods: N×(N-1)/2 comparisons (see the sketch below)
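To sanity-check the expected row count, here is a small illustrative helper (not part of the Sailfish API) that enumerates the same unordered pairs:

```csharp
using System.Collections.Generic;

public static class Pairwise
{
    // One comparison row is produced per unordered pair of methods in a
    // group, giving n * (n - 1) / 2 rows for n methods.
    public static IEnumerable<(string First, string Second)> Pairs(IReadOnlyList<string> methods)
    {
        for (int i = 0; i < methods.Count; i++)
            for (int j = i + 1; j < methods.Count; j++)
                yield return (methods[i], methods[j]);
    }
}

// Pairs(new[] { "A", "B", "C" }) yields (A,B), (A,C), (B,C): 3 comparisons.
```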
Mixed Test Types
The CSV includes both comparison and regular methods:
```
# Individual Test Results
TestClass,TestMethod,MeanTime,MedianTime,StdDev,SampleSize,ComparisonGroup,Status
MyTest,ComparisonMethod1,10.500,9.800,1.200,100,Group1,Success
MyTest,ComparisonMethod2,8.300,8.100,0.900,100,Group1,Success
MyTest,RegularMethod,1.000,1.000,0.100,100,,Success
MyTest,AnotherRegularMethod,1.100,1.000,0.100,100,,Success
```
Best Practices
1. Organize Your Data
Use meaningful test class and method names since they appear in the CSV:
```csharp
[WriteToCsv]
public class DatabaseQueryPerformance // Clear class name
{
    [SailfishMethod]
    [SailfishComparison("QueryTypes")]
    public void SimpleSelect() { } // Descriptive method name

    [SailfishMethod]
    [SailfishComparison("QueryTypes")]
    public void ComplexJoin() { } // Descriptive method name
}
```
2. Use Descriptive Comparison Groups
Choose comparison group names that clearly indicate what's being compared:
[SailfishComparison("DatabaseQueries")] // Good[SailfishComparison("SerializationMethods")] // Good[SailfishComparison("Group1")] // Poor3. Configure Output Directory
Set a consistent output directory for organized results:
```csharp
var runner = SailfishRunner.CreateBuilder()
    .WithRunSettings(settings => settings
        .WithLocalOutputDirectory("./performance-results"))
    .Build();
```
4. Combine with Markdown
Use both output formats for comprehensive reporting:
```csharp
[WriteToMarkdown] // Human-readable reports
[WriteToCsv]      // Data analysis
[Sailfish]
public class ComprehensiveTest { }
```
Troubleshooting
Empty CSV Files
If CSV files are empty or missing:
- Check attribute placement: Ensure [WriteToCsv] is on the test class, not on individual methods
- Verify test execution: CSV is only generated after successful test completion
- Check output directory: Verify the configured output directory exists and is writable
Missing Comparisons
If method comparisons are missing from the CSV:
- Verify group names: Ensure methods use identical group names (case-sensitive)
- Check method count: Need at least 2 methods in a group for comparisons
- Confirm attributes: Both [SailfishMethod] and [SailfishComparison] are required (see the sketch below)
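In particular, a mismatch in case silently splits methods into different groups (names below are illustrative):

```csharp
[WriteToCsv]
[Sailfish]
public class GroupNameGotcha
{
    [SailfishMethod]
    [SailfishComparison("Algorithms")] // group "Algorithms"
    public void MethodA() { }

    [SailfishMethod]
    [SailfishComparison("algorithms")] // different group: names are case-sensitive
    public void MethodB() { }

    // Each group now contains a single method, so no comparison rows appear.
}
```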
Excel Import Issues
If Excel doesn't import correctly:
- Check file encoding: Ensure CSV is saved as UTF-8
- Verify delimiters: Use comma delimiters consistently
- Handle comments: Excel may need manual handling of # comment lines
Integration Examples
CI/CD Pipeline
```yaml
- name: Run Performance Tests
  run: dotnet test --logger "console;verbosity=detailed"

- name: Upload CSV Results
  uses: actions/upload-artifact@v3
  with:
    name: performance-results
    path: "**/TestSession_*.csv"
```
Automated Analysis
```csharp
// Read and analyze CSV results programmatically
var csvData = File.ReadAllText("TestSession_abc12345_Results_20250803_103000.csv");
var results = ParseSailfishCsv(csvData);

// Generate reports, alerts, or dashboards
if (results.HasRegressions)
{
    SendAlert($"Performance regression detected: {results.WorstRegression}");
}
```
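ParseSailfishCsv and SendAlert above are user-defined helpers, not part of the Sailfish API. As a starting point for such a parser, here is a minimal sketch that splits the file into its # sections (field positions are assumed to match the format documented above):

```csharp
using System.Collections.Generic;
using System.Linq;

public static class SailfishCsvSections
{
    // Splits a Sailfish CSV into sections keyed by their "# ..." headers,
    // e.g. "Session Metadata", "Individual Test Results", "Method Comparisons".
    // Each section is a list of rows; each row is an array of raw fields.
    public static Dictionary<string, List<string[]>> Split(string csvText)
    {
        var sections = new Dictionary<string, List<string[]>>();
        List<string[]> current = null;
        foreach (var line in csvText.Split('\n').Select(l => l.TrimEnd('\r')))
        {
            if (line.StartsWith("#"))
            {
                current = new List<string[]>();
                sections[line.TrimStart('#', ' ')] = current;
            }
            else if (current != null && line.Length > 0)
            {
                // A production parser should handle quoted fields; the mostly
                // numeric Sailfish columns make a plain split a workable start.
                current.Add(line.Split(','));
            }
        }
        return sections;
    }
}
```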