What is Normalization?

Normalization is a data transformation process that aligns data values to a common scale or distribution so they can be compared and analyzed together. Normalization includes adjusting the scale of values to a similar metric or adjusting time scales so that like periods can be compared. For example, if you have health data with annual height measurements in feet and daily weight measurements in pounds, normalizing the data could mean rescaling each value to the percentage of the range between the minimum and maximum values.
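The rescaling described above is commonly called min-max normalization. A minimal sketch in Python, using hypothetical weight and height values to stand in for the health-data example (the variable names and sample numbers are illustrative assumptions, not from the source):

```python
def min_max_normalize(values):
    """Map each value to its fraction of the min-to-max range, yielding [0, 1]."""
    lo, hi = min(values), max(values)
    if hi == lo:
        # A constant column carries no scale information; avoid divide-by-zero.
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

# Hypothetical measurements on very different scales:
weights_lb = [130.0, 155.0, 180.0, 205.0]   # daily weight in pounds
heights_ft = [5.0, 5.5, 6.0, 6.5]           # annual height in feet

# After normalization, both variables share the same 0-to-1 scale.
print(min_max_normalize(weights_lb))  # smallest value maps to 0.0, largest to 1.0
print(min_max_normalize(heights_ft))
```

After this transformation, a one-unit difference means the same thing for both variables: the full span from the smallest to the largest observed value.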


Why is Normalization Important?

Normalization ensures that variables with large magnitudes of value do not exert undue influence over variables with small magnitudes of value, and also permits comparisons across like time periods. In the example above, weight would have a larger impact on the initial starting points for many analytics functions, skewing the optimization process and potentially increasing the number of iterations required to converge to an optimal set of parameters. Normalization can remove that bias and reduce the compute cycles required to find an effective model.


How C3 AI Enables Organizations to Use Normalization

C3 AI makes it easy to apply normalization in domain-specific AI applications to deliver business value today. The C3 AI® Suite is a complete, end-to-end platform for designing, developing, deploying, and operating enterprise AI applications at enterprise scale. Data engineers can use Data Studio to access, explore, and ingest data from any source, and then transform, normalize, aggregate, and prepare that data for analysis by machine learning or analytics functions. Ex Machina provides the ability to join, filter, and wrangle data (including normalization) without writing a single line of code.