If a physics model is not deductive, it is a mere empirical compilation, by Einstein's own criterion:
https://www.marxists.org/reference/archive/einstein/works/1910s/relative/ap03.htm
Albert Einstein: "From a systematic theoretical point of view, we may imagine the process of evolution of an empirical science to be a continuous process of induction. Theories are evolved and are expressed in short compass as statements of a large number of individual observations in the form of empirical laws, from which the general laws can be ascertained by comparison. Regarded in this way, the development of a science bears some resemblance to the compilation of a classified catalogue. It is, as it were, a purely empirical enterprise. But this point of view by no means embraces the whole of the actual process; for it slurs over the important part played by intuition and deductive thought in the development of an exact science. As soon as a science has emerged from its initial stages, theoretical advances are no longer achieved merely by a process of arrangement. Guided by empirical data, the investigator rather develops a system of thought which, in general, is built up logically from a small number of fundamental assumptions, the so-called axioms."
Even a single fudge factor (not deduced from postulates) is enough to show that the "theory" is a not-even-wrong empirical concoction:
"A fudge factor is an ad hoc quantity introduced into a calculation, formula or model in order to make it fit observations or expectations. Examples include Einstein's Cosmological Constant..."
https://en.wikipedia.org/wiki/Fudge_factor
"In 1916 Einstein found what he considered a glitch in his new theory of general relativity. His equations showed that the contents of the universe should be moving - either expanding or contracting. But at the time, the universe seemed the very definition of stasis. All the data, facts, and phenomena known in the early 1900s said that the Milky Way was the cosmos itself and that its stars moved slowly, if at all. Einstein had presented the definitive version of the general theory of relativity to the Prussian Academy of Sciences the previous year, and he was not inclined to retract it. So he invented a fudge factor, called lambda, that could function mathematically to hold the universe at a standstill. [...] Lambda, also known as the cosmological constant, has come in handy of late."
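For reference, the "lambda" described above is the extra term Einstein bolted onto his field equations; the standard textbook form of the modified equations is:

```latex
% Einstein field equations with the cosmological constant \Lambda added:
G_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu}
```

The original 1915 equations are recovered by setting Λ = 0; the Λ term was chosen precisely so that a static universe would be a solution.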
http://discovermagazine.com/2004/sep/the-masters-mistakes/
Ken Croswell, Magnificent Universe, p. 179: "Ever since, the cosmological constant has lived in infamy, a fudge factor concocted merely to make theory agree with observation."
http://www.amazon.com/Magnificent-Universe-Ken-Croswell/dp/0684845946
In order to remain consistent with the data, general relativity's conventional dark matter models need four fudge factors:
"Verlinde's calculations fit the new study's observations without resorting to free parameters – essentially values that can be tweaked at will to make theory and observation match. By contrast, says Brouwer, conventional dark matter models need four free parameters to be adjusted to explain the data."
https://www.newscientist.com/article/2116446-first-test-of-rival-to-einsteins-gravity-kills-off-dark-matter/
How many fudge factors LIGO conspirators needed in order to model the nonexistent gravitational waves is a deep mystery:
"Cornell professors Saul Teukolsky, astrophysics, and Larry Kidder, astronomy, played an instrumental role in the first detection of gravitational waves, a century after Albert Einstein predicted their existence in his theory of general relativity. [...] The LIGO and Virgo group confirmed that these gravitational waves had come from the collision of black holes by comparing their data with a theoretical model developed at Cornell. Teukolsky and the Cornell-founded Simulation of eXtreme Spacetimes collaboration group have been developing this model since 2000, according to the University."
http://cornellsun.com/2016/02/10/cornell-scientists-validate-einsteins-theory-of-relativity/
Non-deductive models in physics are essentially equivalent to the "empirical models" defined here:
http://collum.chem.cornell.edu/documents/Intro_Curve_Fitting.pdf
"The objective of curve fitting is to theoretically describe experimental data with a model (function or equation) and to find the parameters associated with this model. Models of primary importance to us are mechanistic models. Mechanistic models are specifically formulated to provide insight into a chemical, biological, or physical process that is thought to govern the phenomenon under study. Parameters derived from mechanistic models are quantitative estimates of real system properties (rate constants, dissociation constants, catalytic velocities etc.). It is important to distinguish mechanistic models from empirical models that are mathematical functions formulated to fit a particular curve but whose parameters do not necessarily correspond to a biological, chemical or physical property."
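A minimal sketch of the distinction, using hypothetical data and the simplest possible "empirical" model, y = a·x, whose single parameter a is chosen purely to make the curve fit the points (a free parameter in the sense above, corresponding to no physical property):

```python
def fit_free_parameter(xs, ys):
    """Closed-form least-squares estimate of a in the one-parameter
    empirical model y = a*x. The value of a is whatever minimizes the
    squared residuals -- it is tuned to the data, not deduced from
    any postulate."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

# Hypothetical observations (illustrative only)
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]

a = fit_free_parameter(xs, ys)
residuals = [y - a * x for x, y in zip(xs, ys)]
print(a)          # the fitted free parameter
print(residuals)  # what the "fit" leaves unexplained
```

With more free parameters the residuals can be driven arbitrarily low without the model explaining anything, which is the point of the mechanistic/empirical distinction quoted above.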
Pentcho Valev