A quality control framework for digital fabrication with concrete

The quality control of digital fabrication with concrete has more stringent requirements than traditional casting. Firstly, since formwork is typically absent, or removed at an early stage in production, the material is exposed to external influences that can result in deformations, collapse, or deterioration. Therefore, the evolution of properties during the process has to be controlled. Secondly, the fabrication systems are typically more sensitive to dosing fluctuations, and the produced, optimized objects are more sensitive to defects, which requires the process variations to be controlled at a higher resolution. A framework is presented that categorizes quality control experiments into destructive and non-destructive, according to their systematic error, and according to the location of testing with respect to the process. This framework is applied to the fresh-state mechanical performance of concrete, and quality control strategies are derived from it. Lastly, research gaps are identified that are critical for the further development and adoption of these quality control strategies in digitally fabricated concrete.


Introduction
The quality of traditionally cast concrete is primarily controlled by compression tests at 28 days, with one sample taken per 40 m³ to 150 m³, depending on the strength class [1]. Additionally, the water/cement ratio, density, and consistency (slump flow, L-box test) are verified at a similar sampling frequency to determine the environmental class, homogeneity, and self-compacting capability, respectively [1]. Furthermore, the evolution of strength during hardening can be monitored using calorimetry and the concept of maturity from the moment the hydration reactions give a measurable temperature increase [2].
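The maturity concept mentioned above can be illustrated with a minimal sketch of the Nurse-Saul maturity index, M = Σ (T − T₀)Δt; the function name, datum temperature, and temperature history below are illustrative assumptions, not values taken from the cited standards.

```python
# Sketch of the Nurse-Saul maturity index M = sum((T - T0) * dt).
# The datum temperature and the sampled history are illustrative only.

def maturity_index(temperatures_c, dt_hours, datum_c=-10.0):
    """Nurse-Saul maturity (degC-hours) from a sampled temperature history.

    Contributions below the datum temperature are clipped to zero, since
    hydration is assumed not to progress there.
    """
    return sum(max(t - datum_c, 0.0) * dt_hours for t in temperatures_c)

# A constant 20 degC history over 24 h at a -10 degC datum
# accumulates (20 - (-10)) * 24 = 720 degC-hours.
history = [20.0] * 24            # hourly temperature readings in degC
m = maturity_index(history, dt_hours=1.0)   # 720.0
```

Pairing such an index with calibration strength tests is what allows strength to be estimated from the temperature history alone.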
For digital fabrication with concrete (DFC), a family of concrete processing technologies that has recently experienced rapid growth in academic and industrial applications, the quality control procedure will have to be more elaborate for three reasons. Firstly, the evolution of material properties has to be determined from deposition to the hardened state. Secondly, the DFC systems and materials, as well as the designs that are typically produced by them, are more sensitive to process variations. Finally, some materials that are used in these fabrication methods are much more sensitive to processing conditions. Controlling the fabrication process is furthermore important since it can influence the long-term structural performance and durability [3][4][5][6][7], and allows for immediate action in case of a detected fault [8,9], resulting in a higher quality of the end product, increased productivity, and a higher degree of material minimization [10]. It is, therefore, not surprising to find a large number of on-line and in-line experiments and sensors being applied in other industries [11].

Evolution (EV)
Since the formwork is typically removed at an early age or completely absent for many DFC technologies, the material properties are to be controlled from an early stage in production. Seconds after the deposition of the material, the strength (yield stress) and stiffness determine the shape of a filament [12], the capacity to fill a (temporary) formwork [13], or the capability to follow the shape of [14] and adhere to a surface [15]. Minutes after deposition, the strength and stiffness increase, providing a load-bearing capacity that is greater than the self-induced stresses and allowing for the vertical growth of the object, either by stacking of layers [16,17], movement of a slip formwork, or decrease of static pressure [17]. Hours or days after deposition, the yield stress has increased sufficiently for the object to carry a load that is much greater than its self-weight, allowing for the application of additional loads coming from the assembly of a structure and eventually live loads.
The experiments that are required to measure the evolution highly depend on the process that is considered. For example, for particle-bed 3D concrete printing the material can initially be dormant [18] and slowly develop strength and stiffness since it is fully supported; for one-component (1K) 3D concrete printing the initial yield stress is typically relatively high and develops slowly; and for two-component (2K) 3D concrete and shotcrete printing the yield stress develops extremely rapidly. The total evolution of yield stress and stiffness spans many orders of magnitude [19], and the rate and moment in the process at which this evolution occurs vary as well, which necessitates a custom solution per process for the measurement of evolution.
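As a minimal illustration of how differently the yield stress can evolve per process, the sketch below contrasts a linear structuration model, often associated with 1K printing, with an exponential growth law resembling accelerated 2K systems; the function names and all parameter values are invented for illustration, not measured data.

```python
import math

# Two illustrative structuration models; parameter values are invented.

def linear_structuration(tau0, athix, t):
    """Linear model tau(t) = tau0 + Athix * t (slow, steady development)."""
    return tau0 + athix * t

def exponential_structuration(tau0, tau_inf, t_char, t):
    """Exponential approach toward a plateau tau_inf with time scale t_char,
    resembling the very rapid development of accelerated 2K systems."""
    return tau_inf + (tau0 - tau_inf) * math.exp(-t / t_char)

# A hypothetical 1K material starting at 2 kPa and structurating at
# 0.5 Pa/s reaches 2.9 kPa after 30 minutes:
tau_1k = linear_structuration(2000.0, 0.5, 30 * 60)          # 2900.0 Pa

# A hypothetical accelerated 2K material starts near 100 Pa and climbs
# toward a 50 kPa plateau within a few characteristic times:
tau_2k = exponential_structuration(100.0, 50000.0, 60.0, 300.0)
```

The point of the contrast is only that a single sampling scheme cannot resolve both behaviors, which is why a custom measurement solution per process is needed.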

Process variations (PV)
The sensitivity to process variations depends on the object design, processing system, and material used. Firstly, DFC generally aims to minimize material use, resulting in slender elements and complex geometries. Such objects are more sensitive to (local) defects unless special care for robustness has been taken [20]. Secondly, continuous or batch-continuous systems with mixing chambers of relatively small volume compared to the flow rate, commonly used in DFC, do not dampen out low-frequency fluctuations in dosing [21,22], resulting in artifacts in the final product [23]. Thirdly, the materials used for production have an intrinsic sensitivity to variations as well, which depends on the material composition [2,24]. These three factors determine how sensitive the process is to unintended variations. The process can vary intentionally as well, for example by varying the flow rate to maximize productivity or by varying material properties to allow for functional grading [25][26][27].

Processing conditions (PC)
Some materials that are used in DFC are very sensitive to the processing conditions under which they are prepared. Firstly, when materials show strong time-dependent behavior, for example in 2K 3D concrete printing, the quality control experiment has to closely match the characteristic time in the original process. This characteristic time can be the contact time for highly accelerated chemical reactions, the flocculation time for colloidal interactions, or the mixing time to ensure equal dispersion. Secondly, when the material has a consistency that makes it sensitive to a certain characteristic load, this should be closely matched in the quality control experiment as well. This characteristic load can be a pressure in the production system, the vibration in the compaction of a sample, or a load that is introduced in the handling of the samples. Materials with a yield stress below ~10 Pa flow easily and are therefore not sensitive to external conditions. Materials with a yield stress above ~10 kPa are not sensitive either, since their strength is larger than the typically applied forces. In the intermediate range, extra care has to be taken to closely match the shear history and avoid unintentional thixotropic breakdown [19]. This coincides with the range in which many DFC materials are situated, since the material is often required to remain pumpable before, and rapidly gain strength after, deposition [19].
For every processing method, the importance of evolution, process variations, and processing conditions is different, and the quality control strategy therefore has to be different as well. The traditional tools available to control the quality are not sufficient, as they would require an extremely large number of samples to capture evolution and process variations and are, furthermore, not representative of all processing conditions. In this paper, a framework is presented to design quality control strategies that are tailored to the needs of each particular fabrication method. This is essential to produce objects that are safe and utilize the material minimization and productivity optimization that DFC has to offer. Section 2 describes the main categories of the framework. The framework is then applied to fresh-state quality control for extrusion-based DFC in section 3 as an example of how it can be used to control the manufacturing process. To control the quality during the use phase of the object, durability and hardened-state experiments are required as well, but these fall outside the scope of this contribution. Subsequently, quality control strategies are identified in section 4. Lastly, the framework is used to identify research gaps in the adoption of these advanced quality control strategies, which can be used as guidelines for the further development of experiments.

Framework
Within the current framework, experiments and sensing methods are categorized into destructive and non-destructive measurements on one axis and according to their systematic error on the other. Furthermore, tests are grouped according to the location at which they are typically performed with respect to the fabrication process: off-line, on-line, in-line, and in-situ, and finally according to how they measure evolution and process variations: individual samples at low frequency or continuous measurements at high frequency.

Destructive and non-destructive
Destructive measurements refer to experiments that do not preserve the sample, while in non-destructive experiments the sample is preserved. The advantage of the former is that they can directly measure mechanical properties. Not all non-destructive measurements can do so, and they therefore typically require a correlation to destructive test measurements. The latter, however, provide the advantage of being able to continuously measure the evolution or process variations, as the sample is not destroyed, which significantly reduces the required number of samples.

Systematic error
Any measurement or observation contains an observational error, which is the difference between the measured value and the true value [28]. Here, the true value refers to the property of the material in the object of which the quality is to be controlled, which might differ from the measured value if the processing conditions are not representative or if a process variation occurs. Within observational error, systematic and random error can be distinguished, which is illustrated in Fig. 1. The former refers to errors that consistently occur between the true value and the measured value, while the latter refers to errors that vary between replicate experiments that have been prepared and conducted in the same way. Within the proposed framework, tests are categorized according to this systematic error. The first source of error considered is processing conditions, which can for example be the result of handling or compacting of the sample that does not occur in the original process. The second source is process variations, which can occur when the experiment is performed on a non-representative sample of material, for example caused by a variation in composition due to dosing fluctuations or a variation in material temperature due to the heating up of the system. Other sources of systematic error, such as incorrect calibration of sensors, are not considered in the framework as they do not lead to a meaningful categorization.

Off-line, on-line, in-line, in-situ
Off-line measurements are conducted separately from the process, and on-line measurements are conducted using the process itself. Both types do not measure the same material that is in the final object, since the material is prepared separately or extracted from the process. In-line and in-situ measurements are conducted on the processing setup and measure the same material that is or will be in the object. The former refers to measurements on the processing line, such as pressure, temperature, and flow sensors, while the latter refers to measurements on the object, such as a terrestrial laser scanner or thermal camera. It should be noted that these definitions differ slightly from those found in the literature [29,30], where, firstly, at-line experiments are defined as being conducted close to the process. Since the only difference between off-line and at-line experiments is the location, and, therefore, any off-line experiment can be an at-line experiment as well, this latter category is omitted. Secondly, in some definitions the term on-line refers to experiments that are conducted on a bypass of the processing line. Since this is typically not applicable to concrete processing, on-line has been redefined.

Quality control of fresh state mechanical performance
The framework is applied to the quality control of fresh-state mechanical properties for DFC. The resulting scheme is shown in Fig. 2.
An extensive list of experiments can be categorized as destructive and off-line with a relatively high systematic error, since the processing conditions and material differ from those in the fabricated object, for example due to the compaction of a sample that is taken from a process, or due to the handling time between sampling and testing. This category includes compression tests (unconfined [31,32], squeeze flow [33], triaxial [34]), tensile tests [32,33,35], slump [36], ram extrusion [37][38][39], and shear tests (direct shear [31,40], vane [41], and constant shear rheometry [33,42,43]). Since the geometry and loading conditions of these tests vary, detailed material behavior can be studied, for example thixotropic, viscous, and pressure-dependent behavior. Furthermore, these tests are not direct replacements for one another and, therefore, one might use a combination of these tests to get a complete insight into the behavior. For these experiments, multiple individual samples are needed to measure the evolution and process variations, and the sampling rate per testing setup is typically low (~10⁻³ Hz). At the top right of the scheme, non-destructive off-line tests with, again, a relatively high systematic error are categorized. These are penetration tests (for example slow cone penetration, Vicat needle) [36,[44][45][46][47], ultrasonic measurements [47][48][49], and oscillatory rheometers [50][51][52][53].
In these setups, the samples are preserved and measurements can be done continuously over time, reducing the number of required samples by one order of magnitude compared to the previous category when the evolution has to be measured at a high resolution over a large domain. To measure process variations, however, separate samples have to be taken. These tests have the potential to be implemented in-line if the samples remain in the process line. To the best of the authors' knowledge, this has not yet been developed and therefore these tests are all categorized as off-line.
In the destructive, on-line category, two groups of tests are identified. The first group can be labeled as near-nozzle tests, currently consisting of near-nozzle droplet formation ("Slugs test"), the cantilever beam test, and the on-line gravity-induced compression test [35,[54][55][56][57][58][59]. The sampling rate of these tests is significantly higher than that of the destructive off-line tests (~1 Hz), and they can therefore continuously measure process variations. The timescale at which the tests can be performed is, however, limited to the first seconds after extrusion at most. The second group of on-line destructive tests consists of production trials. With these tests, a sample is produced until failure, and the material properties are reverse-engineered, which requires an accurate model of the object. So far, this has only been done for 3D concrete printing [16], but it could be extended to other DFC methods as well. The number of required samples for these tests is high, since evolution and process variations are measured with separate samples, and the sampling rate is low. The tests do, however, capture all of the processing conditions, since it is the process itself that is used to produce the sample, and the tests can be performed in a highly automated manner. These experiments do not, however, measure the same material as the material in the final object and, therefore, do not have the lowest possible systematic error. This can be relevant if process variations occur.
In the non-destructive section, two groups of tests are found that have the lowest systematic error. The first group consists of in-line sensors and measurements such as pressure sensors, flow sensors, temperature sensors, tracer experiments [22], as well as torque and frequency measurements of motors [37]. These can measure process variations at high frequency (~10² Hz) and can be correlated to destructive properties to function as in-line non-destructive measurements of the mechanical properties directly after extrusion. Correlation to the development over time might be possible for the initial evolution but is limited since this is an extrapolation.
The second group consists of sensors such as laser line profile scanners [8,60], computer vision [9], and (near-)infrared sensors [60], which can measure deformations and surface properties at a high resolution on the object itself and can be correlated to the fresh-state mechanical performance. Depending on the positioning of these sensors, they can be used to measure the evolution and process variations at high resolution.
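A correlation of the kind described above, mapping an in-line sensor signal to a destructively measured property, can be sketched with ordinary least squares; the choice of pump pressure as the signal, the units, and all calibration values below are invented for illustration.

```python
# Minimal sketch: correlate a hypothetical in-line signal (pump pressure)
# to destructively measured yield stress via ordinary least squares.
# All numbers are invented calibration data, not measurements.

def fit_linear(xs, ys):
    """Return (slope, intercept) of the least-squares line y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Paired calibration data: pressure [bar] vs. off-line yield stress [Pa].
pressure = [10.0, 12.0, 14.0, 16.0]
yield_stress = [800.0, 1000.0, 1200.0, 1400.0]

a, b = fit_linear(pressure, yield_stress)
estimate = a * 13.0 + b   # non-destructive yield stress estimate at 13 bar
```

Once such a calibration is in place, every pressure reading becomes an indirect, non-destructive estimate of the mechanical property, which is what makes the high sensor frequency useful.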

Quality control strategies
From the framework, a road map for quality control strategies is derived. These strategies specifically target the measurement of evolution (EV), the measurement of process variations (PV), and the reduction of systematic error due to processing conditions (PC). The road map is schematically illustrated in Fig. 3.
Fig. 3 A road map for quality control strategies. Strategy EV can be used to measure evolution, PC to measure under representative processing conditions, and PV to measure process variations.
Quality control strategy 0 exclusively uses off-line destructive experiments and can be applied when the evolution is to be measured at only a few points in time, the expected process variations and the sensitivity to them are minimal, and off-line measurements accurately represent the processing conditions. Examples of production processes that can follow this strategy are prefab and on-site casting of concrete elements with self-compacting concrete.
Quality control strategy EV uses off-line or in-situ non-destructive experiments to measure the evolution continuously over a large domain. A correlation to off-line destructive experiments is required when the property of interest is destructive by definition, such as the yield stress, so that a truly non-destructive strategy cannot exist.
Applying quality control strategy EV is beneficial if this correlation is already present or if the domain of evolution is so broad that the reduction of labor outweighs the labor that is required for the correlation. Most DFC methods require evolution properties over a large domain and can therefore make use of this strategy. When the process variations are small and the object and material are insensitive to process variations and processing conditions, it can be sufficient to use the off-line correlation. Otherwise, expanding the strategy with in-situ measurements and potentially combining it with strategy PV is required. The first case applies, for example, to coarse-resolution objects like on-site printed buildings that are produced with large-volume mixing chambers, and the second case applies to fine-resolution objects [61] that are produced with systems that have a small volume or high volumetric flow rate.
Quality control strategy PV uses non-destructive in-line measurements and the initial measurements of in-situ sensors, correlated to destructive on-line experiments, to measure process variations. If these on-line experiments are not available for the property of interest, the correlation can also be drawn with off-line destructive measurements. The strategy is suitable when process variations can occur and the material and object are sensitive to them. An example of a digital fabrication method that can use this strategy is fine-resolution 3D concrete printing [61].
Quality control strategy PC uses on-line destructive experiments to measure under the correct processing conditions. Correlations with off-line destructive experiments can be used to indirectly measure properties for which on-line experiments are not available. For example, by measuring the yield stress on-line and off-line, and measuring the stiffness only off-line, the on-line stiffness modulus can be estimated. Quality control strategy PC is firstly suitable when materials have highly time-dependent behavior, such as accelerated materials that are applied in additive manufacturing [62,63] and Smart dynamic casting [13]. Secondly, it is suitable for materials that have a consistency that makes them sensitive to externally applied conditions, which is for example the case for many materials that rely on colloidal interactions for shape stability and initial buildability.
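The indirect estimation in the example above can be sketched as follows, under the simplifying assumption that the ratio between on-line and off-line yield stress also applies to the stiffness modulus; this proportionality and all numbers below are illustrative assumptions, not a validated relation from the paper.

```python
# Sketch of the indirect on-line stiffness estimation. Assumption (for
# illustration only): the on-line/off-line ratio measured for the yield
# stress also holds for the stiffness modulus.

def estimate_online_stiffness(tau_online, tau_offline, e_offline):
    """Scale the off-line stiffness by the on-line/off-line yield stress ratio."""
    return e_offline * (tau_online / tau_offline)

# Hypothetical values: 1.5 kPa yield stress on-line vs. 1.0 kPa off-line,
# and a 50 kPa stiffness measured off-line, give an estimated
# on-line stiffness of 75 kPa.
e_online = estimate_online_stiffness(1500.0, 1000.0, 50e3)   # 75000.0 Pa
```

In practice the transfer relation between the two properties would itself have to be calibrated, which is exactly the kind of correlation the strategy relies on.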

Conclusion and research gaps
In this paper, a framework is presented that allows for the systematic categorization of quality control experiments for DFC. The control strategies that are derived from it show that there is a need for destructive on-line and non-destructive off-line, in-line, and in-situ experiments to measure evolution and process variations, and to control the quality under representative processing conditions. Off-line destructive experiments are an essential part of the advanced strategies as well, either to correlate destructive and non-destructive measurements or when the property can only be measured in an off-line or destructive setting. To allow for the complete adoption of these strategies, the following research gaps can be identified.
First, the experiments and sensors that have recently been developed by the DFC research community, mostly belonging to the destructive on-line and non-destructive off-line and in-line categories, need standardization. These experiments and sensors have not yet been adopted in codes and standards. Since they are a crucial part of all of the advanced quality control strategies, this is a key threshold for adoption.
Second, the applicable domain (loading conditions and material properties) of existing experiments, falling in the off-line destructive category, needs to be expanded to accommodate the materials that are used in DFC. Since the strength and stiffness of materials used in DFC fall in between those of cast fresh and hardened concrete [19], the testing equipment needs to be adapted accordingly. This can include the loading device, load cell, applied geometry, and underlying material-related assumptions for the analysis of the results.
Third, off-line destructive experiments often have the potential to be applied on-line, and off-line non-destructive experiments have the potential to be used in-line or in-situ, which allows for a reduction in systematic error. To enable this, the standardized geometries of the test setups might have to be adapted to suit the production method, which in turn requires the development of (numerical) analyses with which the effect of geometry can be studied.
Fourth, robust correlations need to be determined between destructive and non-destructive experiments. These robust correlations are a key enabler for all of the detailed quality control strategies, and if they are missing, this could be a threshold for the long-term adoption of DFC with optimized materials, processes, and designs. To enable this, a large-scale collaboration between laboratories and DFC facilities is required to make these correlations widely applicable to many materials and processes.

Fig. 2 A framework for quality control of fresh-state mechanical performance of digitally fabricated concrete. Rectangular boxes indicate how evolution (EV) and process variations (PV) can be measured, where dots refer to individual samples, a line refers to a continuous measurement, and 0 refers to a measurement at or around the initial value of the evolution.