TSQL.APP Excel Data Import Training Document
Overview
This document provides a comprehensive guide to the Excel import functionality in TSQL.APP, focusing on a specific action script that demonstrates best practices for importing, processing, and transforming Excel data within the TSQL.APP framework.
Script Purpose
The script creates a modal interface that allows users to upload an Excel file, processes its contents by extracting structured data from all worksheets, converts the data to a standardized tabular format, and finally transforms it into JSON for further use or visualization. This pattern can be extended to build sophisticated data import pipelines in your TSQL.APP solutions.
Core Components Analysis
1. Modal Interface Setup
DECLARE @class NVARCHAR(MAX);
DECLARE @printStyle NVARCHAR(MAX);
SET @class = N'w100';
SET @printStyle = N'font-size:70%';
EXEC sp_api_modal_modal
@class = @class,
@printStyle = @printStyle,
@landscape = 1;
Functionality:
- Creates a modal dialog that spans 100% of the available width (w100)
- Sets a reduced font size (70%) for print output
- Configures landscape orientation for printing
- Follows TSQL.APP mandated practices by declaring all variables at the start and setting values before passing to procedures
Technical Details:
- sp_api_modal_modal is the core procedure that initializes the modal dialog
- The modal provides a containment context for all subsequent UI components
- Print-specific parameters optimize the output for document generation
2. File Upload Mechanism
DECLARE @file_id INT;
DECLARE @name NVARCHAR(MAX);
DECLARE @to_file_context NVARCHAR(MAX);
DECLARE @to_file_context_id INT;
DECLARE @category NVARCHAR(MAX);
SET @name = N'file1';
SET @to_file_context = @card_name;
SET @to_file_context_id = @id;
SET @category = N'raw_excel';
EXEC sp_api_modal_file_multi
@name = @name,
@to_file_context = @to_file_context,
@to_file_context_id = @to_file_context_id,
@category = @category,
@api_files_id = @file_id OUT;
Functionality:
- Presents a file upload interface that supports both drag-and-drop and file browser selection
- Allows multiple file selection (though this script processes only one file)
- Associates uploaded files with the current card context
- Categorizes files as 'raw_excel' for organizational and retrieval purposes
- Returns the file's system ID for further processing
Technical Details:
- sp_api_modal_file_multi creates the upload component in the UI
- Files are physically stored in TSQL.APP's file system but referenced via their ID
- The @to_file_context and @to_file_context_id parameters establish a relationship between the file and a specific card instance
- The system automatically handles file storage, mime-type detection, and metadata creation
- The @api_files_id OUT parameter captures the ID of the uploaded file
3. Execution Flow Control
IF @file_id IS NULL
RETURN;
Functionality:
- Terminates script execution if no file was uploaded
- Prevents errors when attempting to process non-existent files
- Provides clean exit point for the cancel case
Technical Details:
- This pattern is essential for avoiding null reference errors in subsequent processing
- The RETURN statement immediately exits the entire procedure
- This is a defensive programming practice to prevent cascading errors
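If a silent exit feels too abrupt, a brief notification before the RETURN makes the cancel case explicit to the user. A minimal sketch of this variation, reusing sp_api_toast and the btn-danger class shown later in this document:
DECLARE @cancel_msg NVARCHAR(MAX);
IF @file_id IS NULL
BEGIN
SET @cancel_msg = N'No file was uploaded; the import was cancelled.';
-- Message text and class choice are illustrative, not part of the original script
EXEC sp_api_toast @text = @cancel_msg, @class = N'btn-danger';
RETURN;
END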
4. Excel Content Extraction
DECLARE @json NVARCHAR(MAX);
EXEC sp_api_excel_read
@api_files_id = @file_id,
@json = @json OUT;
Functionality:
- Reads the contents of the uploaded Excel file
- Parses and extracts all worksheets, rows, and cell data
- Converts the hierarchical Excel structure into a JSON representation
- Handles various Excel formats (XLSX, XLS, etc.)
Technical Details:
- sp_api_excel_read is a specialized procedure that leverages SheetJS internally
- The procedure can process multi-sheet workbooks with complex formatting
- The output JSON maintains the hierarchical structure of the Excel file:
- Worksheets at the top level
- Rows within worksheets
- Cells within rows
- Cell values are converted to appropriate data types when possible
- The JSON output can be quite large for complex Excel files
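Because the next step depends on well-formed JSON, a defensive check between extraction and transformation can fail fast with a clear message. A minimal sketch, assuming SQL Server 2016 or later for ISJSON:
IF @json IS NULL OR ISJSON(@json) = 0
BEGIN
-- Illustrative guard; not part of the original script
EXEC sp_api_toast @text = N'The uploaded file could not be read as Excel data.', @class = N'btn-danger';
RETURN;
END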
5. Data Transformation
IF OBJECT_ID('tempdb..#excel') IS NOT NULL
DROP TABLE #excel;
SELECT ws, rn, c01, c02, c03, c04, c05, c06, c07, c08, c09, c10
INTO #excel
FROM fn_api_excel_from_json(@json);
Functionality:
- Transforms the JSON representation of Excel data into a tabular format
- Creates a temporary table (#excel) with a standardized structure
- Extracts the first 10 columns from each worksheet
- Preserves worksheet number (ws) and row number (rn) to maintain the original structure
- Flattens the hierarchical Excel data into a uniform table
Technical Details:
- fn_api_excel_from_json is a function that parses the JSON output from sp_api_excel_read
- The function dynamically generates columns named:
  - ws - Worksheet number (1-based index)
  - rn - Row number within each worksheet (1-based index)
  - c01 through cNN - Cell values from each column
- The SELECT statement filters to only include the first 10 columns
- This pattern allows for consistent processing regardless of the original Excel structure
- The temporary table check and DROP ensures clean execution even when the script is run multiple times
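The ten-column selection is a convention of this script rather than a limit of the function. If wider sheets are expected, or the temporary table will be queried repeatedly, the SELECT above can be adjusted; the extra cNN columns below are an assumption about the workbook, not guaranteed output:
-- Wider selection (assumes the workbook actually has data in c11 and c12)
SELECT ws, rn, c01, c02, c03, c04, c05, c06, c07, c08, c09, c10, c11, c12
INTO #excel
FROM fn_api_excel_from_json(@json);
-- Optional: index the temp table when it will be filtered by worksheet and row repeatedly
CREATE CLUSTERED INDEX IX_excel_ws_rn ON #excel (ws, rn);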
6. JSON Conversion
DECLARE @result_json NVARCHAR(MAX);
SET @result_json = (
SELECT *
FROM #excel
FOR JSON AUTO, INCLUDE_NULL_VALUES
);
Functionality:
- Converts the tabular data back into a JSON format
- Preserves NULL values to maintain data integrity
- Creates a standardized output that can be consumed by other components
Technical Details:
- FOR JSON AUTO automatically structures the JSON based on the table schema
- INCLUDE_NULL_VALUES ensures that cells with NULL values are represented in the JSON
- This JSON representation is more standardized than the original Excel JSON
- The structure is flat and uniform, making it easier to process programmatically
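If a named root or explicit property order is preferred over the AUTO shape, FOR JSON PATH is an alternative. This is a sketch of a variation, not part of the original script:
DECLARE @result_json_rooted NVARCHAR(MAX);
SET @result_json_rooted = (
SELECT ws, rn, c01, c02, c03
FROM #excel
FOR JSON PATH, ROOT('rows'), INCLUDE_NULL_VALUES
);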
7. Result Display
EXEC sp_api_toast @text = @result_json;
Functionality:
- Displays the JSON result as a notification
- Provides immediate feedback about the processed data
- Useful for debugging and data inspection
Technical Details:
- sp_api_toast shows a temporary notification overlay
- In production scenarios, this would likely be replaced with more sophisticated visualization
- The toast has size limitations, so very large datasets may be truncated
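To keep the notification readable when the result is large, the JSON can be truncated before display. A minimal sketch:
DECLARE @preview NVARCHAR(MAX);
SET @preview = LEFT(@result_json, 500)
+ CASE WHEN LEN(@result_json) > 500 THEN N'...' ELSE N'' END;
EXEC sp_api_toast @text = @preview;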
8. Resource Cleanup
IF OBJECT_ID('tempdb..#excel') IS NOT NULL
DROP TABLE #excel;
Functionality:
- Removes the temporary table to free up resources
- Ensures clean state for subsequent executions
- Follows best practices for temporary object management
Technical Details:
- Checking for existence before dropping prevents errors
- Temporary tables (#temp) are automatically cleared when the session ends, but explicit cleanup is a best practice
- This pattern is especially important in long-running sessions
Data Flow Architecture
The script implements a sophisticated data transformation pipeline:
- Input Phase: Excel file → Physical storage
- Extraction Phase: Excel file → JSON representation
- Transformation Phase: JSON → Tabular data (with structural metadata)
- Output Phase: Tabular data → Standardized JSON
This pipeline approach has several advantages:
- Maintains structural information from the original file
- Creates a uniform data format regardless of input structure
- Preserves data types when possible
- Enables further processing or visualization
Working with the Extracted Data
The temporary table (#excel) and final JSON representation provide a flexible foundation for further processing:
Querying the Data
-- Filter data from a specific worksheet
SELECT * FROM #excel WHERE ws = 1;
-- Get header rows from all worksheets
SELECT * FROM #excel WHERE rn = 1;
-- Find specific values
SELECT * FROM #excel WHERE c01 LIKE N'%SKU%' OR c01 = N'Model';
Data Analysis
-- Get count by worksheet
SELECT ws, COUNT(*) AS RowCount
FROM #excel
GROUP BY ws;
-- Find min/max/avg values for numeric columns
SELECT
ws,
MIN(TRY_CAST(c03 AS DECIMAL(10,2))) AS MinValue,
MAX(TRY_CAST(c03 AS DECIMAL(10,2))) AS MaxValue,
AVG(TRY_CAST(c03 AS DECIMAL(10,2))) AS AvgValue
FROM #excel
WHERE ws = 1 AND rn > 1 -- Skip header row
GROUP BY ws;
Data Transformation
-- Pair each first-column value with that column's header (row 1 of the same worksheet)
SELECT
e.ws,
e.rn,
h.c01 AS ColumnName,
e.c01 AS Value
FROM #excel e
JOIN (SELECT ws, c01 FROM #excel WHERE rn = 1) h
ON e.ws = h.ws
WHERE e.rn > 1;
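To cover all captured columns rather than only the first, the c-columns can be unpivoted against the header row. A sketch, assuming row 1 of each worksheet holds the headers (shown here for the first five columns):
SELECT e.ws, e.rn, v.ColumnName, v.Value
FROM #excel e
JOIN #excel h ON h.ws = e.ws AND h.rn = 1
CROSS APPLY (VALUES
(h.c01, e.c01), (h.c02, e.c02), (h.c03, e.c03), (h.c04, e.c04), (h.c05, e.c05)
) AS v (ColumnName, Value)
WHERE e.rn > 1;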
Sample Data Analysis
Based on the provided JSON data, here's an analysis of what was processed:
Worksheet 1: Product Specifications
- Contains approximately 225 rows of monitor product data
- Key columns:
- SKU codes (e.g., "DR2220A1E1000")
- Created dates (e.g., "18.01.2024")
- Gross Weight and Net Weight values
- Product descriptions (e.g., 22" LED MONITOR WHITE (RAL 9003))
Worksheet 2: Inventory and Location Data
- Contains metadata about product storage and attributes
- Key columns:
- Boolean values for "is_micro_item"
- SKU references matching worksheet 1
- Pick positions (warehouse locations)
- Weight values (PalWeight, kgn, kgb)
- Product category consistently showing "agneovo"
Worksheet 3: Packaging and Shipping Information
- Contains logistics data for shipping and packaging
- Key columns:
- Model numbers and SKU references
- Package dimensions (WDH)
- Packaging quantities (pc/ctn, ctn/plt, pc/plt)
- Shipping weights (NW, GW)
- Pallet measurements
Extension Possibilities
This script can be extended in several ways:
Visualization Enhancement
-- Instead of a toast, display in a formatted table
EXEC sp_api_modal_table @tmptable = N'#excel', @print = 1, @excel = 1;
Data Processing Pipeline
-- Create a persistent table for processed data
IF NOT EXISTS (SELECT 1 FROM sys.objects WHERE name = 'ProductData' AND type = 'U')
BEGIN
CREATE TABLE ProductData (
SKU NVARCHAR(50),
Description NVARCHAR(255),
GrossWeight DECIMAL(10,2),
NetWeight DECIMAL(10,2),
CreatedDate DATE,
PickPosition NVARCHAR(50),
IsActive BIT
);
END
-- Insert processed data
INSERT INTO ProductData (SKU, Description, GrossWeight, NetWeight, CreatedDate, PickPosition, IsActive)
SELECT
w1.c01 AS SKU,
w1.c05 AS Description,
TRY_CAST(REPLACE(w1.c03, ',', '.') AS DECIMAL(10,2)) AS GrossWeight,
TRY_CAST(REPLACE(w1.c04, ',', '.') AS DECIMAL(10,2)) AS NetWeight,
TRY_CONVERT(DATE, w1.c02, 104) AS CreatedDate,
w2.c04 AS PickPosition,
CASE WHEN w1.c01 IS NOT NULL THEN 1 ELSE 0 END AS IsActive
FROM
#excel w1
LEFT JOIN
#excel w2 ON w1.c01 = w2.c02 AND w2.ws = 2
WHERE
w1.ws = 1 AND w1.rn > 1;
Multi-File Processing
-- Handle multiple file uploads
DECLARE @file_ids NVARCHAR(MAX);
DECLARE @current_file_id INT;
EXEC sp_api_modal_file_multi
@name = N'files',
@to_file_context = @card_name,
@to_file_context_id = @id,
@category = N'raw_excel',
@api_files_ids = @file_ids OUT;
-- Process each file
DECLARE @file_cursor CURSOR;
SET @file_cursor = CURSOR FOR
SELECT TRY_CAST(value AS INT) FROM STRING_SPLIT(@file_ids, ','); -- cast each comma-separated ID to INT
OPEN @file_cursor;
FETCH NEXT FROM @file_cursor INTO @current_file_id;
WHILE @@FETCH_STATUS = 0
BEGIN
-- Process each file as in the original script
DECLARE @file_json NVARCHAR(MAX);
EXEC sp_api_excel_read @api_files_id = @current_file_id, @json = @file_json OUT;
-- Additional processing here
FETCH NEXT FROM @file_cursor INTO @current_file_id;
END
CLOSE @file_cursor;
DEALLOCATE @file_cursor;
Best Practices and Technical Considerations
Performance Optimization
- Memory Usage: Large Excel files can generate very large JSON strings. For very large files:
  - Consider using chunked processing
  - Apply filtering in the SELECT statement to reduce data volume
  - Process only specific worksheets if known in advance (see the sketch after this list)
- Temporary Table Management:
  - Always check for and drop existing temporary tables
  - Use table variables for smaller datasets where appropriate
  - Consider indexed temporary tables for large datasets that need to be queried multiple times
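A minimal sketch of the filtering advice above, loading only the first worksheet into the temporary table (a variation of the original SELECT ... INTO, not an addition to it):
-- Keep only worksheet 1 while loading; reduces memory use and downstream JSON size
SELECT ws, rn, c01, c02, c03, c04, c05
INTO #excel
FROM fn_api_excel_from_json(@json)
WHERE ws = 1;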
Error Handling
- File Format Validation:
- Add checks for expected worksheet counts or specific headers
- Validate data types in critical columns
- Consider adding a schema validation step
-- Example schema validation
DECLARE @validation_passed BIT;
SET @validation_passed = 1;
-- Check for expected headers
IF NOT EXISTS (
SELECT 1 FROM #excel
WHERE ws = 1 AND rn = 1 AND c01 = N'SKU' AND c02 = N'Created'
)
BEGIN
SET @validation_passed = 0;
EXEC sp_api_toast @text = N'Invalid file format: Expected headers not found', @class = N'btn-danger';
END
-- Only proceed if validation passed
IF @validation_passed = 1
BEGIN
-- Continue processing
END
Security Considerations
- Input Validation: Always validate Excel data before using it in dynamic SQL or other sensitive operations
- Data Type Handling: Use TRY_CAST or TRY_CONVERT to safely handle data type conversions (see the sketch after this list)
- Access Control: Consider implementing row-level security if the imported data has different visibility requirements
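As an illustration of the TRY_CAST guidance, the following sketch flags worksheet 1 rows whose weight column does not convert to a number; the column position (c03) is assumed from the sample data described earlier:
-- Rows whose gross-weight column is present but not a valid number
SELECT ws, rn, c03
FROM #excel
WHERE ws = 1 AND rn > 1
AND c03 IS NOT NULL
AND TRY_CAST(REPLACE(c03, ',', '.') AS DECIMAL(10,2)) IS NULL;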
Integration with TSQL.APP Features
- Card Actions: Build card actions that trigger imports or process imported data
- Workflow Integration: Use sp_api_add_sql_task to schedule background processing for large imports
- Documentation: Add comments and documentation to explain complex data transformations
Conclusion
This Excel import script demonstrates a robust pattern for importing, processing, and transforming Excel data within TSQL.APP. By following the principles and techniques outlined in this document, you can build sophisticated data import solutions that maintain data integrity, provide a good user experience, and integrate seamlessly with other TSQL.APP components.
The combination of UI components, file handling, data transformation, and SQL processing showcases the power of TSQL.APP's integrated approach to application development. This pattern can be adapted to handle various import scenarios, from simple data loading to complex ETL processes.