- Subspace Embedding and Linear Regression with Orlicz Norm (arXiv)
Authors: Alexandr Andoni, Chengyu Lin, Ying Sheng, Peilin Zhong, Ruiqi Zhong
Summary: We consider a generalization of the classic linear regression problem to the case where the loss is an Orlicz norm. An Orlicz norm is parameterized by a non-negative convex function G:R+→R+ with G(0)=0: the Orlicz norm of a vector x∈Rn is defined as ∥x∥G = inf{α>0 : ∑i G(|xi|/α) ≤ 1}. We consider the cases where the function G(⋅) grows subquadratically. Our main result is based on a new oblivious embedding which embeds the column space of a given matrix A∈Rn×d with Orlicz norm into a lower dimensional space with ℓ2 norm. Specifically, we show how to efficiently find an embedding matrix S∈Rm×n, m<n, such that ∀x∈Rd, Ω(1/(dlogn))⋅∥Ax∥G ≤ ∥SAx∥2 ≤ O(d2logn)⋅∥Ax∥G. By applying this subspace embedding technique, we obtain an approximation algorithm for the regression problem minx∈Rd∥Ax−b∥G, up to an O(dlog2n) factor. As a further application of our techniques, we show how to use them to improve the algorithm for the ℓp low rank matrix approximation problem for 1≤p<2.
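To make the definition concrete, here is a minimal sketch (not from the paper) of computing an Orlicz norm numerically. Since ∑i G(|xi|/α) is nonincreasing in α for a nondecreasing G, the infimum can be located by bisection; the function name `orlicz_norm` and the bracketing strategy are illustrative assumptions.

```python
import numpy as np

def orlicz_norm(x, G, lo=1e-12, iters=100):
    """Approximate ||x||_G = inf{alpha > 0 : sum_i G(|x_i|/alpha) <= 1}
    by bisection. Assumes G: R+ -> R+ is convex, nondecreasing, G(0)=0.
    Illustrative sketch, not the paper's algorithm."""
    x = np.abs(np.asarray(x, dtype=float))
    if not x.any():
        return 0.0
    # Find an upper bracket: sum_i G(|x_i|/alpha) decreases as alpha grows.
    hi = x.max()
    while sum(G(xi / hi) for xi in x) > 1.0:
        hi *= 2.0
    # Bisect on alpha: the constraint sum <= 1 holds iff alpha >= ||x||_G.
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if sum(G(xi / mid) for xi in x) > 1.0:
            lo = mid
        else:
            hi = mid
    return hi
```

As a sanity check, G(t)=t² recovers the ℓ2 norm and G(t)=t recovers the ℓ1 norm, so `orlicz_norm([3, 4], lambda t: t**2)` is approximately 5.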