This is partly a software question and partly a purely statistical one. I have a design matrix $X$ with categorical and continuous variables; the first column contains only ones. For a given vector $y$ with as many elements as $X$ has rows, I can compute the OLS estimate of $\beta$ in the model $y = X\beta + u$ via $$\hat\beta = (X'X)^{-1}(X'y).$$ In R I could use `lm(y ~ ., cbind(y, X))`. My goal is to compute type III sums of squares. However, for type III sums of squares it is recommended that continuous predictors be mean-centered and categorical predictors be coded with orthogonal contrasts. My question is: how can I transform $X$ to satisfy these requirements? Of course, I could simply transform my $X$ matrix in a separate step, but that does not feel very efficient. So I was wondering whether there exists a clever transformation matrix $T$ (easily obtained from the `lm` object) such that $TX$ gives me what I want, or whether there are R functions that do exactly this.
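For concreteness, here is a base-R sketch of the recommended setup (the data and model formula are made up for illustration): with sum-to-zero contrasts on the factor and a mean-centered covariate, `drop1()` over the full scope produces marginal (type III) F tests without any extra packages.

```r
# Illustrative data: one categorical and one continuous predictor.
set.seed(1)
d <- data.frame(
  y = rnorm(20),
  g = factor(rep(c("a", "b"), each = 10)),  # categorical predictor
  x = rnorm(20)                             # continuous predictor
)

d$x <- d$x - mean(d$x)                      # mean-center the covariate

# Sum-to-zero ("orthogonal") contrasts for the factor, set per-fit.
fit <- lm(y ~ g * x, data = d,
          contrasts = list(g = contr.sum))

# drop1() with scope = . ~ . tests each term against the full model,
# i.e. type III sums of squares under this coding.
drop1(fit, scope = . ~ ., test = "F")
```

`car::Anova(fit, type = 3)` would give the same tests if the `car` package is available; the `drop1()` route just avoids the dependency.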
My current solution is an R function that accepts a linear model as its argument, extracts the data, mean-centers the continuous predictors, and applies orthogonal contrast coding to the categorical predictors. Finally, I estimate another linear model on the transformed data and compute the type III sums of squares from this new model. As mentioned above, this feels very inefficient, and my hope is that there are better solutions to this problem.
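The re-fitting approach described above can be sketched roughly as follows (the function name `type3_ss` and its structure are illustrative, not a library API, and it assumes the formula uses plain variable names rather than inline transformations):

```r
# Sketch: transform the predictors of a fitted lm and recompute type III SS.
type3_ss <- function(fit) {
  mf <- model.frame(fit)                      # data exactly as used by the model
  for (nm in names(mf)[-1]) {                 # skip the response (first column)
    if (is.numeric(mf[[nm]])) {
      mf[[nm]] <- mf[[nm]] - mean(mf[[nm]])   # mean-center continuous predictors
    } else if (is.factor(mf[[nm]])) {
      contrasts(mf[[nm]]) <- contr.sum(nlevels(mf[[nm]]))  # orthogonal coding
    }
  }
  refit <- update(fit, data = mf)             # re-estimate on transformed data
  drop1(refit, scope = . ~ ., test = "F")     # marginal (type III) tests
}
```

As noted in the question, this re-fits the whole model from scratch rather than reusing anything already computed in the original `lm` object.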