This paper considers estimation and inference using Minimum Divergence (MD) techniques. MD estimators are obtained by minimizing a divergence between the empirical distribution and the distribution implied by the moment restrictions. Important techniques that have received attention as possible alternatives to GMM, such as Empirical Likelihood, Exponential Tilting and Continuous Updating, are special cases of MD estimators. The paper makes three contributions. First, it is proven that the relationship between the Generalized Empirical Likelihood (GEL) class of estimators and the MD class extends beyond the known cases. Second, a Bayesian interpretation is given of the weighting scheme through which MD estimators re-weight observations. Third, it is shown that all members of the MD class that have the same asymptotic bias as Empirical Likelihood are third-order efficient. This result implies that higher-order efficiency is an inadequate criterion for prescribing which specific estimator should be used in applied work. It is argued that, when selecting among the class of third-order efficient estimators, one should consider the boundedness of the influence function. Monte Carlo simulations show that test statistics based on estimators that are third-order efficient and have a bounded influence function outperform tests based on third-order efficient estimators with an unbounded influence function.
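For concreteness, a minimal sketch of the standard moment-condition formulation of MD estimation follows; the notation ($g$, $\hat{p}_n$, $\Delta_{n-1}$) is assumed here rather than taken from the paper. Given moment functions $g(x,\theta)$ satisfying $E[g(x,\theta_0)]=0$ and a divergence $D$, the MD estimator solves
\[
\hat{\theta}_{\mathrm{MD}} \;=\; \arg\min_{\theta}\; \min_{\pi \in \Delta_{n-1}} \; D(\pi, \hat{p}_n)
\quad \text{subject to} \quad \sum_{i=1}^{n} \pi_i \, g(x_i, \theta) = 0,
\]
where $\hat{p}_n$ is the empirical distribution assigning weight $1/n$ to each observation and $\Delta_{n-1}$ is the unit simplex of probability weights $\pi = (\pi_1,\dots,\pi_n)$. In this formulation, choosing $D$ to be the Kullback-Leibler divergence $KL(\hat{p}_n \,\|\, \pi)$ yields Empirical Likelihood, while $KL(\pi \,\|\, \hat{p}_n)$ yields Exponential Tilting.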