RMF EOS emcee workflow

First, install the emcee package, which is used for the MCMC sampling in this notebook:

[ ]:
pip install emcee

What does emcee do?

emcee is an open-source Python library designed for MCMC sampling. Here’s how it operates:

  1. Walkers: emcee uses multiple “walkers” that explore the parameter space. Each walker represents a point in the parameter space and moves around based on the probabilities defined by the log likelihood and log prior.

  2. Sampling Process: At each step, every walker proposes a new position; by default emcee generates this proposal with the affine-invariant "stretch move", which uses the current positions of the other walkers in the ensemble. The proposal is then accepted or rejected with a Metropolis-Hastings-type criterion, so that positions with higher posterior probability are more likely to be accepted.

  3. Convergence: Over many iterations, the walkers explore the parameter space, allowing for convergence to the posterior distribution of the parameters. After a sufficient number of steps, the samples from the walkers can be used to estimate the posterior distributions and uncertainties of the parameters.
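
To make these steps concrete, here is a minimal, self-contained toy example (a one-dimensional Gaussian target, unrelated to the EOS model) that shows the basic emcee calling pattern used throughout this notebook:

[ ]:
import numpy as np
import emcee

# Toy target distribution: a one-dimensional standard normal (log density up to a constant)
def log_prob(x):
    return -0.5 * np.sum(x**2)

ndim, nwalkers, nsteps = 1, 16, 2000
p0 = np.random.randn(nwalkers, ndim)  # random starting positions for the walkers

sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob)
sampler.run_mcmc(p0, nsteps)  # each step proposes and accepts/rejects new positions for every walker

samples = sampler.get_chain(discard=200, flat=True)  # drop burn-in, flatten all walkers
print(samples.mean(), samples.std())  # should be close to 0 and 1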

Import necessary libraries

[6]:
import numpy as np
import emcee
import math
import InferenceWorkflow.Likelihood as likelihood
import InferenceWorkflow.prior as prior
import TOVsolver.constant as const
import TOVsolver.main as main
import EOSgenerators.crust_EOS as crust
import EOSgenerators.RMF_EOS as RMF
import scipy.stats as stats
import corner  # used for the posterior corner plot at the end

Loading Crust Model Data

[4]:
# Load the crust EOS table (adjust the path to where Tolos_crust_out.txt lives on your machine)
Tolos_crust_out = np.loadtxt('/Users/DELL/CompactOject/Test_Case/Tolos_crust_out.txt',
                             delimiter=None, comments='#', usecols=(0, 1, 2, 3, 4))
# Convert energy density and pressure to geometrized units
eps_crust_out = Tolos_crust_out[:, 3] * const.G / const.c**2
pres_crust_out = Tolos_crust_out[:, 4] * const.G / const.c**4
eps_crust, pres_crust = crust.PolyInterpolate(eps_crust_out, pres_crust_out)

Prior

This function defines the log prior probability for the parameters. The prior is flat (uniform) within specified bounds; if the parameters fall outside these bounds, the function returns −∞ (the log of zero), marking them as invalid. The two-parameter version below is a generic template; the prior transform actually used for our RMF model follows in the next cell.

[5]:
# Define the prior functions
def log_prior(params):
    # Unpack parameters
    central_density, other_param = params  # Adjust based on your actual parameters

    # Define prior ranges
    if 0 < central_density < 2.5 and 0 < other_param < 1:  # example ranges
        return 0.0  # Flat prior
    return -np.inf  # Log of zero for out of bounds

Prior transform for our RMF model

Instead of a log prior in parameter space, the RMF workflow uses a prior transform: it maps a point of the unit hypercube to physical parameter values through flat and normal (inverse-CDF style) transformations from InferenceWorkflow.prior. The last entry, d1, is the log10 of the central density used for the mass-radius likelihood; it is restricted to the range over which the mass-radius curve is still rising, i.e. up to the maximum-mass configuration.

[ ]:
# Define parameters for the model
parameters = ['g_sigma', 'g_omega', 'g_rho', 'kappa', 'lambda_0', 'zeta', 'Lambda_w', 'd1']
# For two or more MR measurements, define d2 or more depending on complexity
# parameters = ['g_sigma', 'g_omega', 'g_rho', 'kappa', 'lambda_0', 'zeta', 'Lambda_w', 'd1', 'd2']

def prior_transform(cube):
    # Transform cube values into physical parameters
    params = cube.copy()

    # Normal and flat prior transformations
    params[0] = math.sqrt(prior.normal_Prior(107.5, 7.5, cube[0]))  # g_sigma
    params[1] = math.sqrt(prior.flat_prior(150, 210, cube[1]))  # g_omega
    params[2] = math.sqrt(prior.flat_prior(75, 210, cube[2]))  # g_rho
    params[3] = prior.normal_Prior(2.525 / const.oneoverfm_MeV, 1.525 / const.oneoverfm_MeV, cube[3])  # kappa
    params[4] = prior.normal_Prior(0.0045, 0.0205, cube[4])  # lambda_0
    params[5] = prior.flat_prior(0, 0.04, cube[5])  # zeta
    params[6] = prior.flat_prior(0, 0.045, cube[6])  # Lambda_w

    # Constants
    g_sigma = params[0]
    g_omega = params[1]
    g_rho = params[2]
    kappa = params[3]
    lambda_0 = params[4]
    zeta = params[5]
    Lambda_w = params[6]
    m_sig = 495 / const.oneoverfm_MeV  # sigma meson mass (fm^-1)
    m_w = 3.96544                      # omega meson mass (fm^-1)
    m_rho = 3.86662                    # rho meson mass (fm^-1)

    # Parameter array for RMF
    theta = np.array([m_sig, m_w, m_rho, g_sigma, g_omega, g_rho, kappa, lambda_0, zeta, Lambda_w])

    # Compute EOS
    ep, pr = RMF.compute_EOS(eps_crust, pres_crust, theta)

    # Stack energy density and pressure
    eps_total = np.hstack((eps_crust, ep))
    pres_total = np.hstack((pres_crust, pr))

    # Initialize lists for mass-radius
    RFSU2R = []
    MFSU2R = []
    density = np.logspace(14.3, 15.6, 50)

    # Check monotonicity of total energy density and pressure
    if all(x < y for x, y in zip(eps_total[:-1], eps_total[1:])) and all(x < y for x, y in zip(pres_total[:-1], pres_total[1:])):
        MR = main.OutputMR('', eps_total, pres_total).T
        if MR[1].size == 0:
            params[7] = 0  # No valid MR
        else:
            for i in range(len(MR[1])):
                RFSU2R.append(MR[0][i])
                MFSU2R.append(MR[1][i])
                if i > 20 and MR[1][i] - MR[1][i - 1] < 0:
                    break

    # Set d1 parameter based on maximum mass
    if len(MFSU2R) == 0:
        params[7] = 0  # No valid MR
    else:
        max_index = len(MFSU2R)
        max_d = np.log10(density[max_index - 1])
        params[7] = 14.3 + (max_d - 14.3) * cube[7]  # Transform d1 parameter
    return params
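
As a quick illustrative check (assuming the cells above have been executed), a single random unit-cube draw can be pushed through prior_transform to see the physical parameter values it produces; prior.flat_prior and prior.normal_Prior act as inverse-CDF style maps from the unit interval to each parameter's range:

[ ]:
# Illustrative check: map one random unit-cube sample to physical RMF parameters
cube = np.random.rand(len(parameters))
sample_params = prior_transform(cube)
for name, value in zip(parameters, sample_params):
    print(f"{name}: {value}")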

Log Likelihood in EMCEE

This function computes the log likelihood of the model given the parameters: it computes the equation of state, retrieves the mass-radius relation, and compares the result with observed data. The version below is a generic template kept for reference; the concrete likelihood for our RMF model is defined in the next section.

[ ]:
# Generic log likelihood template (schematic; the concrete version for our
# RMF model is likelihood_transform below)
def log_likelihood(params):
    central_density, other_param = params  # Adjust based on your actual parameters

    # Compute the equation of state and the mass-radius relation
    # (replace with your specific EOS and TOV calls)
    RMF.compute_EOS(central_density, other_param)
    mass_radius = main.OutputMR()

    # Calculate likelihood based on your data and model
    obs_data = ...  # Define your observed data
    likelihood_value = likelihood.calculate(mass_radius, obs_data)  # Replace with actual calculation

    return likelihood_value

Likelihood for our NS model

[ ]:
# Define the likelihood transformation function
def likelihood_transform(theta):
    eps_crust, pres_crust = crust.PolyInterpolate(eps_crust_out, pres_crust_out)
    g_sigma, g_omega, g_rho, kappa, lambda_0, zeta, Lambda_w, d1 = theta

    # Compute EOS
    m_sig = 495 / const.oneoverfm_MeV
    m_w = 3.96544
    m_rho = 3.86662
    theta1 = np.array([m_sig, m_w, m_rho, g_sigma, g_omega, g_rho, kappa, lambda_0, zeta, Lambda_w])
    ep, pr = RMF.compute_EOS(eps_crust, pres_crust, theta1)

    eps_total = np.hstack((eps_crust, ep))
    pres_total = np.hstack((pres_crust, pr))

    # Mass-radius likelihood from a simulated MR measurement
    # (mean mass 1.4 Msun, mean radius 13 km, and the corresponding widths)
    probMRgaussian = likelihood.MRlikihood_Gaussian(eps_total, pres_total, (1.4, 13, 0.07, 0.65), d1)

    # Gravitational-wave event (GW170817) likelihood, built from kernel density
    # estimates of the event's posterior samples
    GW170817 = np.load('GW170817_McQL1L2weights.npy')
    chirp170817 = stats.gaussian_kde(GW170817[:, 0], weights=GW170817[:, 4])
    kernelGW = stats.gaussian_kde(GW170817.T[0:4], weights=GW170817[:, 4])
    probGW = likelihood.TidalLikihood_kernel(eps_total, pres_total, (kernelGW, chirp170817), d1)

    # Combine the likelihood contributions
    prob = probMRgaussian + probGW
    return prob
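
For orientation, the tuple (1.4, 13, 0.07, 0.65) passed to MRlikihood_Gaussian specifies the simulated mass-radius measurement used here: a mass of about 1.4 M⊙ and a radius of about 13 km, together with their widths. Conceptually, a Gaussian M-R likelihood behaves like the sketch below; this is only an illustration of the idea, not the implementation inside InferenceWorkflow.Likelihood:

[ ]:
# Conceptual sketch (not the package implementation): score how close the
# model's mass and radius, evaluated at the sampled central density, lie to
# the simulated measurement.
def gaussian_MR_loglike_sketch(M_model, R_model, M_obs=1.4, R_obs=13.0,
                               sigma_M=0.07, sigma_R=0.65):
    return (-0.5 * ((M_model - M_obs) / sigma_M) ** 2
            - 0.5 * ((R_model - R_obs) / sigma_R) ** 2)

# Example: a configuration with M = 1.38 Msun and R = 12.5 km
print(gaussian_MR_loglike_sketch(1.38, 12.5))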

Walkers and the log probability function

[ ]:
# Define the log probability function (generic two-parameter example)
def log_probability(params):
    lp = log_prior(params)
    if not np.isfinite(lp):
        return -np.inf
    return lp + log_likelihood(params)

Setting up the sampler

This block initializes the MCMC sampler. It defines the number of walkers (the ensemble members that explore the parameter space in parallel) and the number of parameters being estimated, and it generates random initial positions for each walker within specified bounds.

[ ]:
# Set up the sampler for the generic two-parameter example
nwalkers = 50  # Number of walkers
ndim = 2  # Number of dimensions (parameters)
p0 = [np.random.rand(ndim) * [2.5, 1] for _ in range(nwalkers)]  # Initial positions of walkers
[ ]:
# Define the log probability function for emcee.
# Here the walkers move in the unit hypercube: prior_transform maps each
# unit-cube point to physical RMF parameters (encoding the priors), and
# likelihood_transform evaluates the likelihood for those parameters.
def log_probability(cube):
    if np.any(cube <= 0.0) or np.any(cube >= 1.0):
        return -np.inf  # outside the unit cube: zero prior probability
    params = prior_transform(np.asarray(cube))
    if not np.all(np.isfinite(params)):
        return -np.inf
    return likelihood_transform(params)

# Set up the MCMC sampler
nwalkers = 50  # Number of walkers
ndim = 8       # Number of dimensions (parameters)
p0 = np.random.rand(nwalkers, ndim)  # Initial walker positions in the unit cube

Initialize and Run

Here, the EnsembleSampler from emcee is created with the defined log probability function. The MCMC sampling is executed for a specified number of steps, allowing progress tracking.

[ ]:
# Initialize the emcee sampler
sampler = emcee.EnsembleSampler(nwalkers, ndim, log_probability)

# Run the MCMC sampling (each step evaluates the TOV equations for every walker, so this can take a while)
nsteps = 1000  # Number of steps
sampler.run_mcmc(p0, nsteps, progress=True)
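
After the run, emcee can estimate the integrated autocorrelation time, which gives a rough guide for how many initial steps to discard as burn-in and how much to thin the chain (a short chain may not allow a reliable estimate, hence the try/except):

[ ]:
# Optional convergence check: estimate the autocorrelation time and use it to
# choose burn-in and thinning before flattening the chain.
try:
    tau = sampler.get_autocorr_time()
    burnin = int(2 * np.max(tau))
    thin = int(0.5 * np.min(tau))
    print(f"autocorrelation time: {tau}")
    print(f"suggested burn-in: {burnin} steps, thinning: {thin}")
except emcee.autocorr.AutocorrError:
    print("Chain too short for a reliable autocorrelation estimate; consider running more steps.")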

For our NS Model

Results

[ ]:
# # Analyze the results
# samples = sampler.get_chain(flat=True)
# np.save('mcmc_samples.npy', samples)  # Save the samples for later analysis

# # Example: Compute and print parameter estimates
# central_density_estimate = np.mean(samples[:, 0])
# other_param_estimate = np.mean(samples[:, 1])
# print(f"Estimated Central Density: {central_density_estimate}")
# print(f"Estimated Other Parameter: {other_param_estimate}")


# Analyze the results. The chain is stored in unit-cube coordinates, so map
# each sample back to physical parameters with prior_transform before
# summarizing (this re-runs the TOV solver for the d1 transform, so it can
# take some time; optionally pass discard=burnin to get_chain to drop burn-in).
flat_cube = sampler.get_chain(flat=True)
samples = np.array([prior_transform(cube) for cube in flat_cube])
np.save('mcmc_samples.npy', samples)  # Save the samples for later analysis

# Example: compute and print parameter estimates (posterior means)
parameter_estimates = np.mean(samples, axis=0)
for i, param in enumerate(parameters):
    print(f"Estimated {param}: {parameter_estimates[i]}")

Finally, visualize the posterior distributions with a corner plot.

[ ]:
labels = parameters  # axis labels: the RMF parameter names defined earlier
figure = corner.corner(samples, labels=labels,
                       smooth=0.9,
                       label_kwargs=dict(fontsize=22),
                       title_kwargs=dict(fontsize=22),
                       quantiles=[0.16, 0.5, 0.84],
                       levels=(1 - np.exp(-0.5), 1 - np.exp(-2), 1 - np.exp(-9 / 2.)),
                       plot_density=False,
                       plot_datapoints=False,
                       fill_contours=True,
                       show_titles=True,
                       max_n_ticks=3,
                       title_fmt='.2f')