As for the OP, I don’t know of any existing work along exactly these lines, but that doesn’t mean it’s not a potentially fruitful approach, or at least a useful conceptual organization. (In the last blog linked to, a commenter raises the objection that “you’re not going to get much more than new terms for complexity concepts”, but this correspondence of concepts across disciplines can be valuable in itself. One might just as well declare that studying arithmetic with the tools of group theory is simply a renaming of terms, but to be so dismissive is to ignore the way such shifts in perspective bring into view the applicability of general results and machinery not previously recognized as relevant…).
[As a further slight nitpick, the commenter seems to have in mind a category in which any two reductions between the same languages are considered equal (as evidenced, for example, by their description of languages in P as initial objects and NP-complete ones as terminal objects). But the more interesting thing, I would think, is to look at a category which has not already gone through such preorder reflection. Alternatively, perhaps they simply meant “weakly initial/terminal”].
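To illustrate the nitpick with a toy sketch (the languages and reductions here are made-up examples, and many-one reductions stand in for reductions generally): two parallel reductions between the same pair of languages need not be equal as functions, and collapsing them to a single arrow is exactly what the preorder reflection does.

```python
# Two *different* many-one reductions from the toy language EVEN_ONES
# (strings with an even number of 1s) to itself.  In the preorder (poset)
# reflection these would become one and the same arrow; in the unreflected
# category they are distinct morphisms.

EVEN_ONES = lambda x: x.count("1") % 2 == 0

r1 = lambda x: x            # the identity reduction
r2 = lambda x: x + "11"     # appending "11" preserves the parity of 1s

# Both are valid reductions (checked here on finitely many sample strings):
samples = ["", "1", "01", "110", "10101"]
for r in (r1, r2):
    assert all(EVEN_ONES(x) == EVEN_ONES(r(x)) for x in samples)

# ...yet they are not equal as functions:
assert r1("0") != r2("0")
```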
Anyway, when starting off, I see no need to limit ourselves prematurely to the NP-hard languages. We might well first investigate the category whose objects are arbitrary languages and whose morphisms are arbitrary Turing reductions. (If we liked, we could investigate arbitrary problems as well, not just decision problems.) This is the obvious categorification of computability theory (a.k.a. recursion theory), the poset reflection of this category being the Turing degrees. By then limiting ourselves to Cook or Karp reductions and/or to NP-complete or NP-hard languages, we are simply extracting certain subcategories of interest. Off the top of my head, I can’t think of anything particularly intelligent to say about the relations between these, though I’m sure there are some adjunctions and so forth to be found.
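As a concrete (if toy) sketch of the categorical structure, here is a Python illustration, using many-one reductions as a tractable special case of Turing reductions and two made-up example languages; identity morphisms and composition behave just as the category laws require:

```python
# Objects: languages, given here by membership predicates on strings.
# Morphisms: many-one reductions, i.e. total computable maps f with
#   x in A  iff  f(x) in B.
# (Turing reductions are more general, but many-one reductions already
# compose categorically; names below are invented for illustration.)

def is_reduction(f, A, B, samples):
    """Check (on finitely many sample strings) that f reduces A to B."""
    return all(A(x) == B(f(x)) for x in samples)

# Two toy languages over {0,1}*:
EVEN_ONES = lambda x: x.count("1") % 2 == 0   # even number of 1s
EVEN_LEN  = lambda x: len(x) % 2 == 0         # even length

# A reduction EVEN_ONES -> EVEN_LEN: delete the 0s, so the length of the
# image equals the number of 1s in the input.
red = lambda x: x.replace("0", "")

samples = ["", "0", "1", "10", "11", "0110", "10101"]
assert is_reduction(red, EVEN_ONES, EVEN_LEN, samples)

# Identity and composition: reductions compose as functions, and the
# identity map reduces every language to itself.
ident = lambda x: x
compose = lambda g, f: (lambda x: g(f(x)))
assert is_reduction(ident, EVEN_ONES, EVEN_ONES, samples)
assert is_reduction(compose(ident, red), EVEN_ONES, EVEN_LEN, samples)
```

The poset reflection then forgets *which* reduction witnesses an arrow, remembering only whether one exists — which is exactly the passage from this category to the Turing degrees (or, for polytime many-one reductions, to the usual degree structure under Karp reducibility).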
But, as ftg points out, any proof one way or the other regarding P vs. NP would have to get around the relativization barrier, so one will have to find some property not preserved when these categories are reconstructed relative to an arbitrary oracle in order to conclude anything on that front. Still, regardless of whether such a fundamental insight is itself category-theoretic or comes from outside, the framework can help structure and clarify the work which needs to be done. (To use the previous analogy: this is the way one might invoke groups to present work in arithmetic even when one’s argument largely depends on fundamental insights from outside group theory; groups, categories, and so forth are conceptual tools of ubiquitous application, not merely to be invoked when unavoidably necessary.)
If one is interested not in this category in particular but in connections between category theory and complexity theory more generally, one perhaps more straightforward thing to do is to investigate the category whose objects are “data types” and whose morphisms are computable functions between them, and its relation to the subcategory containing only the polytime functions (whose output sizes are also polynomially bounded, to ensure closure under composition). Under the Curry-Howard isomorphism, we should expect such categories to correspond to some logic; I think the line of work on “bounded linear logic” and “light linear logic” is probably the one to investigate here.
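To spell out why the output-size bound ensures closure under composition (a standard estimate, stated under the assumption that f and g are computed by machines with the indicated time and output-size bounds):

```latex
% Suppose f is computed in time p(n) with |f(x)| \le q(|x|), and
% g is computed in time r(m) with |g(y)| \le s(|y|), for polynomials p,q,r,s.
\[
  \mathrm{time}_{g \circ f}(x) \;\le\; p(|x|) + r\bigl(q(|x|)\bigr),
  \qquad
  |g(f(x))| \;\le\; s\bigl(q(|x|)\bigr).
\]
% Polynomials are closed under addition and composition, so g \circ f is
% again polytime with polynomially bounded output size.
```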
In a different vein of connections between category theory, complexity theory, and logic, I also have several papers sitting around on “polynomial-time realizability” which I keep meaning to look into and yet have not gotten around to. But Seely’s “Graded Multicategories of Polynomial-Time Realizers” hits all the right buzzwords.