We are updating a system developed in Delphi that stored its data in Access tables, migrating it to PostgreSQL. So far so good; however, I came across the following situation:
In the old database we had the following tables to store the movements:
- notes entered
To standardize, I thought of grouping everything into a single table, making it easier to update fields, and creating a field to distinguish one type of entry from the next.
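Something along these lines (table and column names here are just illustrative, not my real schema):

```sql
-- Single movements table; tipo_lancamento distinguishes the kind of entry
CREATE TABLE lancamentos (
    id              serial        PRIMARY KEY,
    tipo_lancamento varchar(20)   NOT NULL,  -- e.g. 'nota_entrada', 'nota_saida'
    data_lancamento date          NOT NULL,
    valor           numeric(12,2) NOT NULL
);

-- An index on the discriminator column keeps per-type queries fast
CREATE INDEX idx_lancamentos_tipo ON lancamentos (tipo_lancamento);
```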
Can this cause slowness? Is it an acceptable technique? What are the pros and cons?
Normalization serves to ensure that data is not duplicated and that the database is easier to maintain. In your case, the old database is normalized; the proposed "update", by contrast, turns it into a single "do-it-all" table holding a lot of data that cannot be properly grouped or reused.
You can normalize by following the five Normal Forms (NFs). The example below shows a simpler case that ends up with fewer tables after being normalized:
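As an illustration (a sketch with hypothetical tables, not your actual schema): instead of one table mixing every kind of movement, with type-specific columns left NULL most of the time, the fields common to all movements stay in one table and each entry type gets its own table referencing it:

```sql
-- Common fields shared by every kind of movement
CREATE TABLE movimento (
    id    serial        PRIMARY KEY,
    data  date          NOT NULL,
    valor numeric(12,2) NOT NULL
);

-- Type-specific fields live in their own tables,
-- so no column is ever NULL "because it belongs to another type"
CREATE TABLE nota_entrada (
    movimento_id integer     PRIMARY KEY REFERENCES movimento (id),
    fornecedor   varchar(60) NOT NULL
);

CREATE TABLE nota_saida (
    movimento_id integer     PRIMARY KEY REFERENCES movimento (id),
    cliente      varchar(60) NOT NULL
);
```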