March 6, 2023

SQL to DTO Mapper

ENGIN™, an asset analytics accelerator, gives utility asset management planners a powerful tool to gather the insights they need across their entire power system. ENGIN™ accelerates all the analytical workflows required for planning and models the system of interconnected assets, from generating source to customer load, bringing forward the intelligence utilities need to make better business decisions.

We can identify two general classifications of data in our application:

  • Tracking data: created by the system in response to user interactions, e.g. notes assigned to different entities within the system
  • Analysis data: business analytics generated through the core workflows in ENGIN

We do not use any Object-Relational Mapping (ORM) in our project. While an ORM may be standard for the "tracking data", which is subject to typical CRUD operations (create, read, update, delete), it would be infeasible and overly complex to implement and maintain for our "analysis data", due to large data volumes, a high degree of flexibility across different users, a performance focus skewed towards read operations, and the fact that most read operations involve data operations (aggregations, etc.).

  • Tracking data: we have instead chosen to implement lightweight abstractions to perform necessary CRUD operations
  • Analysis data: custom fit-for-purpose solutions for different components of our application, based on the unique requirements for different components

Technology stack:

  • Main technology: Java
  • Main database: SQL database via PostgreSQL
  • Key frameworks: Spring / SpringBoot

Problem Statement

Our lightweight abstraction for CRUD operations allows us to execute SQL queries to read data from our database and map it into a Java data model. Sometimes we then convert the data model into a different Data Transfer Object (DTO) for various reasons: e.g. to perform further analysis, or to strip unnecessary information before sending it over an Application Programming Interface (API). This latter workflow is more common when accessing analysis data. After a recent feature build-out we found we were frequently reading analysis data from the database into a data model (using our custom lightweight abstractions) and immediately converting it into a separate DTO, often reading from the same normalized database table(s) to create different DTOs. This created redundancies and sub-optimal performance when dealing with larger datasets.

We decided to create a new implementation for these specific operations: execute a SQL query and return results in any custom class. These are strictly read operations. We narrowed down to four functional requirements:

  1. Read a single custom DTO
  2. Read a list of custom DTOs (single class)
  3. Read a single data point as a specific data type
  4. Read a list of data points as a specific data type (single class)

Requirements 3 and 4 are simplifications of requirements 1 and 2. Beyond the functional requirements above, we also needed to be able to autowire this reader as a bean in Spring Boot for easy use across the application.

With these requirements in place we can quickly and easily create a Java DTO and read data from the database into it. There are no restrictions on the underlying persistence structure beyond the complexity of the SQL query itself: read an entire single table, read part of a single table, join and read from multiple tables, perform aggregations / database operations and read the results, etc.
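
As a running example for the rest of this article, consider a minimal hypothetical DTO (the class name and fields are illustrative, not part of the actual ENGIN data model):

// Hypothetical DTO used for illustration; fields use lowerCamelCase and map
// to lower_snake_case SQL columns. A no-args constructor is required because
// the reader instantiates DTOs via reflection.
public class AssetDto {

    private Long id;
    private String assetName;
    private LocalDate installDate;

    public AssetDto() {}

    // getters and setters omitted for brevity
}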

Implementation

Our requirements can be summarized by the following interface, `SQLToDTOReader`.


public interface SQLToDTOReader {

    <T> Optional<T> fetch(String sql, Class<T> dto);

    <T> List<T> fetchList(String sql, Class<T> dto);

    <T> Optional<T> fetchSingleColumn(String query, Class<T> resultType);

    <T> List<T> fetchSingleColumnList(String query, Class<T> resultType);
}

Using this interface, we implemented the class `SQLToDTOReaderImpl`. The class uses three annotations:

  • `@Slf4j` for logging
  • `@Component` to register the class as a Spring component
  • `@Transactional(readOnly = true)` for read-only data access

@Slf4j
@Transactional(readOnly = true)
@Component
public class SQLToDTOReaderImpl implements SQLToDTOReader {

    @Autowired
    private DataSource dataSource;
    ...

}

The implementation needed a connection source, for which we used `DataSourceUtils.getConnection()`. This method takes the `DataSource` bean and obtains a connection from Spring's managed connection pool.
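
A minimal sketch of such a helper, assuming it lives inside the `SQLToDTOReaderImpl` class shown above (the method name `getConnection` is our own convention; `DataSourceUtils` is Spring's):

private Connection getConnection() {
    // Obtain a connection from Spring's managed pool; connections acquired this
    // way participate in the surrounding Spring-managed transaction.
    return DataSourceUtils.getConnection(dataSource);
}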

An example of the complete interface implementation is provided in this Gist: [SQLToDTOReader](https://gist.github.com/mominsamir/f965603a2d431cd9a26b31e6017ead8c). We will focus on one method in this article: fetching a list of DTOs.

  • Input:
    • `String sql`: SQL statement to be executed, prepared externally
    • `Class<T> dto`: DTO class
  • Return: a list of DTOs of the generic type

Basic exception handling is demonstrated and returns an empty list; more complex implementations may need additional exception handling. The implementation of the other methods is very similar.
      
@Override
public <T> List<T> fetchList(String sql, Class<T> dto) {
    List<T> outList = new ArrayList<>();
    // Build the field-name map for the DTO once per query
    var propertiesMap = Util.convertToPropertyMapper(dto);
    try (var statement = getConnection().prepareStatement(sql);
         var resultSet = statement.executeQuery()) {
        while (resultSet.next()) {
            outList.add(autoMapModel(resultSet, dto, propertiesMap));
        }
        return outList;
    } catch (SQLException | InvocationTargetException | IllegalAccessException e) {
        log.error("Failed to read {} list from database", dto.getSimpleName(), e);
        return outList;
    }
}
      

Mapping SQL result set to a Java DTO

The bulk of the heavy lifting is performed by a helper method called `autoMapModel`, which maps the `ResultSet` into instance(s) of the DTO class. This method reads the relevant class properties of the DTO, converts from database types into Java data types, and finally creates an instance of the DTO class.

  • Input:
    • `ResultSet resultSet`
    • `Class<T> clazz` DTO class
  • Return: instance of DTO class, or throws an exception

We want this method to support different database types and the different Java types used within our DTOs, and to avoid enforcing any unnecessary column-order constraints in the SQL query itself. Pseudo-code:


private <T> T autoMapModel(ResultSet resultSet, Class<T> clazz, Map<String, Class<?>> propertiesMap)
        throws SQLException, InvocationTargetException, IllegalAccessException {
    /*
     * This method is responsible for reading the result set and converting it to a Java DTO:
     * -> iterate through each column in the result set
     * -> map each column label from the SQL result set to a property of the DTO via propertiesMap
     * -> read each value into a Java object based on the database type
     */
}

First we examine the structure of the result set and the DTO class:

  • `metaData` is used to get the total column count as well as the column labels from the (PostgreSQL) result set
  • `propertiesMap` holds the field names of the DTO class; see `Util.convertToPropertyMapper` for details (a hypothetical sketch follows the snippet below)
  • `dataMap` will be populated with Java objects for each property of the DTO

PgResultSetMetaData metaData = (PgResultSetMetaData) resultSet.getMetaData();
Map<String, Object> dataMap = new HashMap<>();

/*
 *  iterate through each column in the resultset
 * -> Map each column label from the SQL resultset to a property of the DTO
 * -> Read each value into a Java object based on the database type
 */ 
for (int j = 1; j <= metaData.getColumnCount(); j++) { ... }
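
`Util.convertToPropertyMapper` itself is not shown in this article; a hypothetical sketch, assuming it simply reflects over the DTO's declared fields to build a field-name-to-type map, might look like:

// Hypothetical sketch: build a field-name -> field-type map for the DTO class
// using reflection (java.lang.reflect.Field).
public static Map<String, Class<?>> convertToPropertyMapper(Class<?> clazz) {
    Map<String, Class<?>> propertiesMap = new HashMap<>();
    for (Field field : clazz.getDeclaredFields()) {
        propertiesMap.put(field.getName(), field.getType());
    }
    return propertiesMap;
}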

We read the column label from the SQL statement and map it to a property of the Java DTO, throwing an exception if the property is not found in the class. Here you may want to support custom logic for column nomenclature vs. Java class property nomenclature. This example uses lower_snake_case for SQL column names and lowerCamelCase for Java class properties.


/*
 * -> Map each column label from the SQL resultset to a property of the DTO
 */
 
// Get column label from SQL query, expect lower_snake_case
String columnLabel = metaData.getColumnLabel(j);

// Convert to java property name as lowerCamelCase
String columnProperty = CaseFormat.LOWER_UNDERSCORE.to(CaseFormat.LOWER_CAMEL, columnLabel);

// Throw exception if this property is not found in the java class
String fieldName = propertiesMap.keySet().stream().filter(f -> f.equalsIgnoreCase(columnProperty)).findFirst()
  .orElseThrow(() -> new IllegalArgumentException("Property not found with name " + columnProperty));
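
For example, a column selected as `install_date` would map to the Java property `installDate` of our hypothetical `AssetDto`; if the class had no such field, the `IllegalArgumentException` above would be thrown.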

We handle different PostgreSQL data types and map them to the appropriate Java objects. Many data types are handled here, but different applications may have different requirements (database extensions may also need to be accommodated if custom types are added). Results are inserted into the `dataMap` keyed by property name.


/*
 * -> Read each value into a Java object based on the database type
 */
 
 switch (metaData.getColumnType(j)) {
    case Types.TINYINT, Types.SMALLINT, Types.INTEGER -> {
        int intValue = resultSet.getInt(columnLabel);
        dataMap.put(fieldName, resultSet.wasNull() ? null : intValue);
    }
    case Types.BIGINT -> {
        long longValue = resultSet.getLong(columnLabel);
        boolean isNull = resultSet.wasNull();
        dataMap.put(fieldName, isNull ? null : longValue);
    }
    case Types.FLOAT -> {
        float floatValue = resultSet.getFloat(columnLabel);
        dataMap.put(fieldName, resultSet.wasNull() ? null : floatValue);
    }
    case Types.REAL, Types.DOUBLE, Types.DECIMAL, Types.NUMERIC -> {
        double doubleValue = resultSet.getDouble(columnLabel);
        dataMap.put(fieldName, resultSet.wasNull() ? null : doubleValue);
    }
    case Types.CHAR, Types.VARCHAR, Types.LONGVARCHAR, Types.NCHAR, Types.NVARCHAR, Types.LONGNVARCHAR -> dataMap.put(fieldName, resultSet.getString(columnLabel));
    case Types.DATE -> dataMap.put(fieldName, convert(resultSet.getDate(columnLabel)));
    case Types.TIME -> dataMap.put(fieldName, resultSet.getTime(columnLabel));
    case Types.TIME_WITH_TIMEZONE, Types.TIMESTAMP_WITH_TIMEZONE, Types.TIMESTAMP -> dataMap.put(fieldName, convert(resultSet.getTimestamp(columnLabel)));
    case Types.BIT, Types.BOOLEAN -> dataMap.put(fieldName, resultSet.getBoolean(columnLabel));
    case Types.NULL -> dataMap.put(fieldName, null);
}
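
For instance, if your schema uses PostgreSQL's `uuid` type (which the JDBC driver reports as `Types.OTHER`), a hypothetical additional case might look like:

// Hypothetical extension: PostgreSQL reports uuid columns as Types.OTHER,
// and getObject returns a java.util.UUID for them.
case Types.OTHER -> dataMap.put(fieldName, resultSet.getObject(columnLabel));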

Finally, we construct an instance of the DTO. The Apache Commons `BeanUtilsBean` class helps convert the map of key/value pairs into the DTO, with the support of a few generic functions (which are outside the scope of this blog). Note that we also support enumeration conversion, following the convention that all enumerations in Java are in UPPERCASE format.



// construct an instance of the DTO
return getInstance(dataMap, clazz);

@SuppressWarnings("unchecked")
private <T> T getInstance(Map<String, Object> dataMap, Class<T> dto) throws IllegalAccessException, InvocationTargetException {
    T t = getInstance(dto);

    BeanUtilsBean beanUtilsBean = new BeanUtilsBean(new ConvertUtilsBean() {
        @Override
        public Object convert(String value, Class clazz) {
            if (clazz.isEnum()) {
                return Enum.valueOf(clazz, value.toUpperCase());
            } else {
                return super.convert(value, clazz);
            }
        }
    });
    beanUtilsBean.getConvertUtils().register(false, true, 0);
    beanUtilsBean.populate(t, dataMap);
    return t;
}

// Instantiate the DTO via its (required) no-args constructor
protected <T> T getInstance(Class<T> dto) {
    try {
        return dto.getDeclaredConstructor().newInstance();
    } catch (InstantiationException | IllegalAccessException | InvocationTargetException | NoSuchMethodException e) {
        log.error("Failed to instantiate DTO {}", dto.getSimpleName(), e);
    }
    return null;
}


public LocalDateTime convert(java.sql.Timestamp time) {
    if (time == null) return null;
    return time.toLocalDateTime();
}

public LocalDate convert(java.sql.Date date) {
    if (date == null) return null;
    return date.toLocalDate();
}

Putting it all together

A fully constructed version of the code can be seen in this Gist: [SQL To DTO Reader](https://gist.github.com/mominsamir/f965603a2d431cd9a26b31e6017ead8c).

Once the implementation is complete, we can inject the `SQLToDTOReader` interface wherever it is required and use the appropriate method to read data and map it to DTOs.
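
A hypothetical usage sketch (the service, the query, and `AssetDto` are illustrative, not part of the actual ENGIN codebase):

@Service
public class AssetQueryService {

    @Autowired
    private SQLToDTOReader sqlToDTOReader;

    // Read all assets into the hypothetical AssetDto; lower_snake_case column
    // labels in the query map onto the DTO's lowerCamelCase fields.
    public List<AssetDto> findAllAssets() {
        String sql = "SELECT id, asset_name, install_date FROM asset";
        return sqlToDTOReader.fetchList(sql, AssetDto.class);
    }
}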

Conclusion

This implementation has helped our team write less boilerplate code and add new implementations more quickly and easily. This blog presents an example use-case which may need to be expanded for different reasons:

  • Support for more data types to cover all types used within your application (including custom types)

___

Article written by Samir Momin. Samir is a Senior Software Developer and Technical Lead at Engineered Intelligence, with over 14 years of experience in the development, maintenance, and end-to-end testing of client-facing software applications. Samir brings his full-stack expertise and his experience with a vast number of technologies and frameworks (such as Spring Boot) to design and implement the best solutions for ENGIN™.
