There is a nice Physical Review E article by Marcos Rigol, Tyler Bryant, and Rajiv Singh which considers the application of a new numerical linked-cluster (NLC) algorithm to the t-J model. To put things nicely in context they state:
In spite of its simplicity, understanding finite-temperature thermodynamic properties of the t-J model has proven to be a very challenging task. Quantum Monte Carlo simulations suffer from severe sign problems, which become a major difficulty at low temperatures. The two general approaches that have been commonly used to study this model are [exact diagonalisation] ED and [high temperature expansions] HTE. ED studies in which one fully diagonalizes the t-J Hamiltonian are difficult since they can only be done for very small systems, as a consequence of which finite size effects are very large. A more efficient approach to this problem is the finite-temperature Lanczos method (FTLM), which has been developed by Jaklič and Prelovšek (JP). Within this approach the full thermodynamic trace is reduced by randomly sampling the eigenstates of the Hamiltonian. This allows one to study larger system sizes in an unbiased way, but still finite size effects become relevant as the temperature is lowered.
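The random-sampling idea behind FTLM can be illustrated with a minimal sketch. This is not the actual FTLM (which evaluates each matrix element in a Lanczos basis for Hamiltonians too large to diagonalize); here a small toy Hermitian matrix stands in for the t-J Hamiltonian, so the exact eigenbasis can be used, and only the stochastic estimate of the thermodynamic trace is demonstrated:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dense Hermitian matrix as a stand-in for the t-J Hamiltonian
# (hypothetical example; the real model lives in a huge Hilbert space).
D = 200
A = rng.standard_normal((D, D)) / np.sqrt(D)
H = (A + A.T) / 2

beta = 1.0
E, V = np.linalg.eigh(H)
Z_exact = float(np.sum(np.exp(-beta * E)))

# Stochastic trace estimate:  Z ≈ (D/R) Σ_r <r| e^{-βH} |r>
# over R random unit vectors |r>, as in the sampling step of FTLM.
R = 200
acc = 0.0
for _ in range(R):
    r = rng.standard_normal(D)
    r /= np.linalg.norm(r)
    w = V.T @ r                      # components of |r> in the eigenbasis
    acc += float(np.sum(np.exp(-beta * E) * w**2))
Z_est = D * acc / R

print(Z_exact, Z_est)  # the two agree to within sampling error
```

The same sampling trick applies to any thermal expectation value, which is why FTLM can reach larger lattices than full diagonalization at a comparable cost per random vector.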
The outstanding question concerning high temperature expansion (HTE) methods is whether they can give reliable results at the "low" temperatures relevant to experiments. Here it should be stressed that the energy scales t and J are of the order of 1000 K. On the other hand, HTE and NLC have the distinct advantage that they are valid for the infinite lattice and do not suffer from finite size effects (a problem for FTLM and ED).
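Why a truncated high-temperature series breaks down can be seen in a toy example (not the actual t-J series): the entropy of a single two-level system with splitting Delta, comparing the exact result with the series obtained by expanding ln Z = ln(2 cosh(βΔ/2)) in powers of βΔ:

```python
import numpy as np

# Hypothetical illustration: two-level system with splitting Delta.
Delta = 1.0

def S_exact(T):
    # Exact entropy: S = ln Z + beta*U with Z = 2 cosh(x), x = beta*Delta/2
    x = Delta / (2 * T)
    return np.log(2 * np.cosh(x)) - x * np.tanh(x)

def S_series(T):
    # Truncated high-temperature expansion: S ≈ ln 2 - x^2/2 + x^4/4
    x = Delta / (2 * T)
    return np.log(2) - x**2 / 2 + x**4 / 4

for T in (10.0, 1.0, 0.1):
    print(T, S_exact(T), S_series(T))
```

At T well above Delta the truncation is essentially exact, but at T = 0.1 Delta the series blows up while the true entropy goes to zero. The question for the t-J model is exactly how far down in T/t the HTE and NLC results remain controlled.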
The figure below shows the very encouraging result that the complementary methods NLC and FTLM are in agreement for a calculation of the temperature dependence of the entropy, down to temperatures as low as about 0.1t, and for a range of dopings.
Two earlier posts considered the significance of FTLM results for understanding the metallic phase of cuprates at optimal doping.