Part 1: Generating the HQL parser code.
(1) First, unpack the code under Hibernate's src directory.
(2) Install and configure ANTLR.
(3) Copy the three grammar files under the grammar directory (hql.g, hql-sql.g, sql-gen.g) into a working directory, then from the command line in that directory run "java antlr.Tool hql.g", "java antlr.Tool hql-sql.g", and "java antlr.Tool sql-gen.g" in order, and copy the generated Java code into org.hibernate.hql.antlr in the source tree.
Part 2: Let's start the analysis from QueryTranslatorImpl. When session.find('...') is called, QueryTranslatorImpl's compile method is invoked to translate the HQL statement into SQL; compile mainly delegates to doCompile:
// PHASE 1 : Parse the HQL into an AST.
HqlParser parser = parse( true );
// PHASE 2 : Analyze the HQL AST, and produce an SQL AST.
HqlSqlWalker w = analyze( parser, collectionRole );
sqlAst = ( Statement ) w.getAST();
// PHASE 3 : Generate the SQL.
generate( ( QueryNode ) sqlAst );
queryLoader = new QueryLoader( this, factory, w.getSelectClause() );
The main code of parse:
private HqlParser parse(boolean filter) {
    HqlParser parser = HqlParser.getInstance( hql );
    parser.statement();
    AST hqlAst = parser.getAST();
    return parser;
}
The main code of analyze:
private HqlSqlWalker analyze(HqlParser parser, String collectionRole) throws QueryException, RecognitionException {
HqlSqlWalker w = new HqlSqlWalker( this, factory, parser, tokenReplacements, collectionRole );
AST hqlAst = parser.getAST();
w.statement( hqlAst );
return w;
}
The main code of generate:
private void generate(AST sqlAst) throws QueryException, RecognitionException {
SqlGenerator gen = new SqlGenerator(factory);
gen.statement( sqlAst );
sql = gen.getSQL();
}
As you can see, there are three main steps: parse turns the HQL into an HQL AST; analyze turns that HQL AST into an SQL AST; and generate walks the SQL AST to produce the SQL string (retrieved via gen.getSQL()). The call to w.getSelectClause() supplies the select-clause metadata the QueryLoader needs to execute the query and assemble the results.
"Use a parser to extract the AST, then use a tree parser to run the actions: a double-pass builder pattern that decouples parsing from generation; combined with templates, this is the best practice recommended by ANTLR." - from "How Hibernate3 Interprets HQL"
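The double-pass idea can be sketched in plain Java without ANTLR: one pass builds a generic AST, a second pass (the "tree walker") rewrites it into a typed SQL AST, and a final walk emits the SQL string. All class and method names below are illustrative, not Hibernate's:

```java
import java.util.*;

// Minimal double-pass sketch: generic AST -> typed SQL AST -> SQL string.
class Ast {
    final String token;                      // e.g. "QUERY", "FROM", an identifier
    final List<Ast> children = new ArrayList<>();
    Ast(String token, Ast... kids) { this.token = token; children.addAll(Arrays.asList(kids)); }
}

// The strongly typed node produced by the second pass.
class SelectNode {
    final String entity, alias;
    SelectNode(String entity, String alias) { this.entity = entity; this.alias = alias; }
}

public class DoublePass {
    // Pass 1 would normally be a real parser; here we hand-build the AST for "from Cat c".
    static Ast parse() {
        return new Ast("QUERY", new Ast("FROM", new Ast("Cat"), new Ast("c")));
    }

    // Pass 2: walk the generic AST and build the strongly typed SQL AST.
    static SelectNode analyze(Ast hqlAst) {
        Ast from = hqlAst.children.get(0);
        return new SelectNode(from.children.get(0).token, from.children.get(1).token);
    }

    // Pass 3: walk the typed AST and emit SQL text.
    static String generate(SelectNode sql) {
        return "select " + sql.alias + ".* from " + sql.entity + " " + sql.alias;
    }

    public static void main(String[] args) {
        System.out.println(generate(analyze(parse())));  // select c.* from Cat c
    }
}
```

Because the generation pass only sees the tree, the parser can change (or a second front-end can be added) without touching the generator, which is exactly the decoupling the quote describes.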
1. The roles of the three grammar files:
"
Hibernate has three HQL grammar files, under the /grammar directory:
1. hql.g defines the token and parser classes; it parses HQL into an HQL abstract syntax tree (AST).
2. hql-sql.g defines a tree walker that transforms the HQL AST into an SQL AST, decoupling the generation stage from Hibernate.
3. sql-gen.g defines a tree walker that generates SQL from the SQL AST.
" - from "How Hibernate3 Interprets HQL"
2. Step-by-step analysis:
(1) The parse method:
HqlParser extends HqlBaseParser, which is generated from hql.g; it mainly implements the template methods that HqlBaseParser defines. Its input is the HQL string and its output is the HQL AST.
(2) The analyze method:
HqlSqlWalker extends HqlSqlBaseWalker, which is generated from hql-sql.g; HqlSqlBaseWalker in turn extends TreeParser. It mainly implements the template methods that HqlSqlBaseWalker defines. Its input is the HQL AST and its output is the SQL AST.
hql-sql.g is much simpler than hql.g, because the HQL AST has already been built for us (by the parser generated from hql.g); all hql-sql.g has to do is assemble that HQL AST into a strongly typed SQL AST. It makes heavy use of the vocabulary (token types) defined in hql.g.
For example:
query!
: #( QUERY { beforeStatement( "select", SELECT ); }
// The first phase places the FROM first to make processing the SELECT simpler.
#(SELECT_FROM
f:fromClause
(s:selectClause)?
)
(w:whereClause)?
(g:groupClause)?
(o:orderClause)?
) {
// Antlr note: #x_in refers to the input AST, #x refers to the output AST
#query = #([SELECT,"SELECT"], #s, #f, #w, #g, #o);
beforeStatementCompletion( "select" );
processQuery( #s, #query );
afterStatementCompletion( "select" );
}
;
This rule mainly calls template methods to assemble the strongly typed SQL AST (a heterogeneous AST). All the node classes are defined in org.hibernate.hql.ast.tree.
Unlike the first step, where HqlParser requires almost no hand-written code, here we have to implement the template methods that HqlSqlWalker inherits from HqlSqlBaseWalker.
(3) The generate method:
This walks the SQL AST and outputs the SQL statement. SqlGenerator extends SqlGeneratorBase, which is generated from sql-gen.g. Its input is the SQL AST and its output is the SQL string. Personally, I think this step could also be done entirely in hand-written code, which might be clearer and more flexible.
selectExpr
: e:selectAtom { out(e); }
| count
| #(CONSTRUCTOR (DOT | IDENT) ( selectColumn )+ )
| methodCall
| aggregate
| c:constant { out(c); }
| arithmeticExpr
| PARAM { out("?"); }
| sn:SQL_NODE { out(sn); }
| { out("("); } selectStatement { out(")"); }
The out("("); calls there are what assemble the SQL text.
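A hand-written equivalent of this out()-based emission, along the lines the author suggests, might look like the following. This is an illustrative sketch, not Hibernate's actual SqlGeneratorBase; all names are made up:

```java
// An illustrative, hand-written SQL emitter in the style of the out() calls
// that sql-gen.g generates into SqlGeneratorBase.
public class HandWrittenGen {
    private final StringBuilder buf = new StringBuilder();

    void out(String s) { buf.append(s); }   // accumulate SQL text, like out() in sql-gen.g

    void param() { out("?"); }              // the PARAM alternative: emit a placeholder

    void subquery(Runnable inner) {         // the { out("("); } selectStatement { out(")"); } alternative
        out("(");
        inner.run();
        out(")");
    }

    String sql() { return buf.toString(); }

    public static void main(String[] args) {
        HandWrittenGen gen = new HandWrittenGen();
        gen.out("select ");
        gen.subquery(() -> gen.out("select max(c.id) from Cat c"));
        gen.out(" from dual");
        System.out.println(gen.sql());  // select (select max(c.id) from Cat c) from dual
    }
}
```

The trade-off is that the grammar file gives you exhaustiveness checking over the node vocabulary for free, while the hand-written version is easier to step through in a debugger.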
// (The original post starts mid-class; the class declaration and the ORDER_ASC
// constant referenced by addOrderASC are reconstructed here.)
public class QueryBuilder {
    public final String ORDER_ASC = "ASC";
    public final String ORDER_DESC = "DESC";

    public String alias = "";

    public QueryBuilder(String alias) {
        this.alias = alias;
    }

    public boolean countMode = false;

    public QueryBuilder setCountMode(boolean countMode) {
        this.countMode = countMode;
        return this;
    }

    public boolean pageMode = false;
    public int firstResult = -1;
    public int maxResult = -1;

    public QueryBuilder setPage(int firstResult, int maxResult) {
        if (firstResult < 0) firstResult = 0;
        if (maxResult > 0 && firstResult >= 0) {
            pageMode = true;
            this.firstResult = firstResult;
            this.maxResult = maxResult;
        }
        return this;
    }

    public ArrayList<String> fromTable = new ArrayList<String>();
    public ArrayList<String> fromAlias = new ArrayList<String>();
    public ArrayList<Boolean> fromReturnBack = new ArrayList<Boolean>();

    public QueryBuilder addFrom(String objectName, String alias, boolean returnBack) {
        this.fromTable.add(objectName);
        this.fromAlias.add(alias);
        this.fromReturnBack.add(Boolean.valueOf(returnBack));
        return this;
    }

    public ArrayList<String> orderColumn = new ArrayList<String>();
    public ArrayList<String> orderMode = new ArrayList<String>();

    public QueryBuilder addOrderASC(String orderColumn) {
        this.orderColumn.add(orderColumn);
        this.orderMode.add(this.ORDER_ASC);
        return this;
    }

    public QueryBuilder addOrderDESC(String orderColumn) {
        this.orderColumn.add(orderColumn);
        this.orderMode.add(this.ORDER_DESC);
        return this;
    }

    public ArrayList<String> groupColumn = new ArrayList<String>();

    public QueryBuilder addGroup(String groupColumn) {
        this.groupColumn.add(groupColumn);
        return this;
    }

    public ArrayList<Criterion> criterion = new ArrayList<Criterion>();

    public QueryBuilder addExpression(Criterion e) {
        this.criterion.add(e);
        return this;
    }
}
Could someone help explain this code?
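It is a fluent query builder: every mutator returns `this`, so a whole query description can be written as one chained expression, and the DAO later reads the accumulated public fields to assemble the HQL. A trimmed, self-contained replica (Criterion support omitted; field and method names copied from the post) demonstrates the chaining:

```java
import java.util.ArrayList;

// Trimmed, self-contained replica of the post's QueryBuilder, for demonstration only.
class MiniQueryBuilder {
    public final String ORDER_ASC = "ASC";
    public String alias;
    public boolean pageMode = false;
    public int firstResult = -1, maxResult = -1;
    public ArrayList<String> orderColumn = new ArrayList<String>();
    public ArrayList<String> orderMode = new ArrayList<String>();

    MiniQueryBuilder(String alias) { this.alias = alias; }

    MiniQueryBuilder addOrderASC(String column) {
        orderColumn.add(column);
        orderMode.add(ORDER_ASC);
        return this;                 // returning this is what enables chaining
    }

    MiniQueryBuilder setPage(int firstResult, int maxResult) {
        if (firstResult < 0) firstResult = 0;
        if (maxResult > 0) {
            pageMode = true;
            this.firstResult = firstResult;
            this.maxResult = maxResult;
        }
        return this;
    }
}

public class QueryBuilderDemo {
    public static void main(String[] args) {
        // One chained expression: alias "m", ordered by m.id, rows 0..19.
        MiniQueryBuilder q = new MiniQueryBuilder("m").addOrderASC("m.id").setPage(0, 20);
        System.out.println(q.alias + " " + q.orderColumn.get(0) + " " + q.pageMode);  // m m.id true
    }
}
```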
import java.util.*;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import org.apache.log4j.Logger;
import org.hibernate.*;
import org.springframework.orm.hibernate3.support.*;
import com.ccgj.common.entity.Entity;
import com.ccgj.common.model.EntityDAO;

public abstract class EntityDAOImpl extends HibernateDaoSupport implements EntityDAO {

    static Logger logger = Logger.getLogger("com.cnt.common.model.EntityDAOImpl");

    abstract public Class getEntityClass() throws Exception;

    public Entity getNewEntity() throws Exception {
        return (Entity) getEntityClass().newInstance();
    }
    public void save(Entity bean) throws Exception {
        if (bean == null) { throw new Exception("Common_DOObjectIsNull"); }
        this.getHibernateTemplate().save(bean);
    }

    public void update(Entity bean) throws Exception {
        if (bean == null) { throw new Exception("Common_DOObjectIsNull"); }
        if (bean.getId() == null) { throw new Exception("Common_DOObjectIDIsNull"); }
        this.getHibernateTemplate().update(bean);
    }

    public void saveOrUpdate(Entity bean) throws Exception {
        if (bean == null) { throw new Exception("Common_DOObjectIsNull"); }
        this.getHibernateTemplate().saveOrUpdate(bean);
    }

    public void delete(Entity bean) throws Exception {
        if (bean == null) { throw new Exception("Common_DOObjectIsNull"); }
        if (bean.getId() == null) { throw new Exception("Common_DOObjectIDIsNull"); }
        this.getHibernateTemplate().delete(bean);
    }

    public Entity searchByID(Long ID) throws Exception {
        if (ID == null) { throw new Exception("Common_DOObjectIDIsNull"); }
        Entity bean = null;
        Session session = getSessionFactory().openSession();
        try {
            bean = (Entity) session.get(getEntityClass(), ID);
            session.close();
        } catch (Exception ex) {
            try {
                session.close();
            } catch (Exception e) {
                e.printStackTrace();
            }
            throw ex;
        }
        return bean;
    }
    public List searchByQuery(QueryBuilder query) throws Exception {
        StringBuffer sql = new StringBuffer();
        // Build the SELECT clause: count mode returns COUNT(*); otherwise the main
        // alias plus any joined aliases flagged for return.
        if (query.countMode) {
            sql.append("SELECT COUNT(*)");
        } else {
            sql.append("SELECT ").append(query.alias);
            for (int i = 0; i < query.fromAlias.size(); i++) {
                Boolean returnBack = (Boolean) query.fromReturnBack.get(i);
                if (returnBack.booleanValue()) {
                    sql.append(",").append(query.fromAlias.get(i));
                }
            }
        }
        // Build the FROM clause; supports multiple entities.
        sql.append(" FROM ").append(getEntityClass().getName()).append(" ").append(query.alias);
        for (int i = 0; i < query.fromTable.size(); i++) {
            sql.append(",").append(query.fromTable.get(i)).append(" ").append(query.fromAlias.get(i));
        }
        StringBuffer log = new StringBuffer().append(sql);
        // Build the WHERE clause.
        StringBuffer sqltemp = new StringBuffer();
        StringBuffer logtemp = new StringBuffer();
        ArrayList<Object> valus = new ArrayList<Object>();
        // Append the first n-1 expressions, each followed by AND.
        for (int i = 0; i < query.criterion.size() - 1; i++) {
            Criterion c = (Criterion) query.criterion.get(i);
            if (BeanExpression.class.isInstance(c)) {
                BeanExpression e = (BeanExpression) c;
                // Call back into the subclass implementation.
                genBeanExpressionSQL(e, query.alias);
            }
            sqltemp.append(c.toSqlString()).append(" AND ");
            logtemp.append(c.toLogString()).append(" AND ");
            for (int j = 0; j < c.getValues().length; j++) {
                valus.add(c.getValues()[j]);
            }
        }
        // Append the last expression (no trailing AND).
        if (query.criterion.size() > 0) {
            Criterion c = (Criterion) query.criterion.get(query.criterion.size() - 1);
            if (BeanExpression.class.isInstance(c)) {
                BeanExpression e = (BeanExpression) c;
                genBeanExpressionSQL(e, query.alias);
            }
            sqltemp.append(c.toSqlString());
            logtemp.append(c.toLogString());
            for (int j = 0; j < c.getValues().length; j++) {
                valus.add(c.getValues()[j]);
            }
        }
        if (sqltemp.length() > 0) {
            sql.append(" WHERE ").append(sqltemp);
            log.append(" WHERE ").append(logtemp);
        }
        // Build GROUP BY; supports multiple columns.
        if (query.groupColumn.size() > 0) {
            sql.append(" GROUP BY ");
            log.append(" GROUP BY ");
        }
        for (int i = 0; i < query.groupColumn.size() - 1; i++) {
            sql.append(query.groupColumn.get(i)).append(",");
            log.append(query.groupColumn.get(i)).append(",");
        }
        if (query.groupColumn.size() > 0) {
            sql.append(query.groupColumn.get(query.groupColumn.size() - 1));
            log.append(query.groupColumn.get(query.groupColumn.size() - 1));
        }
        // Build ORDER BY; supports multiple columns.
        if (query.orderColumn.size() > 0) {
            sql.append(" ORDER BY ");
            log.append(" ORDER BY ");
        }
        for (int i = 0; i < query.orderColumn.size() - 1; i++) {
            sql.append(query.orderColumn.get(i)).append(" ").append(query.orderMode.get(i)).append(",");
            log.append(query.orderColumn.get(i)).append(" ").append(query.orderMode.get(i)).append(",");
        }
        if (query.orderColumn.size() > 0) {
            sql.append(query.orderColumn.get(query.orderColumn.size() - 1))
               .append(" ").append(query.orderMode.get(query.orderColumn.size() - 1));
            log.append(query.orderColumn.get(query.orderColumn.size() - 1))
               .append(" ").append(query.orderMode.get(query.orderColumn.size() - 1));
        }
        logger.debug(new StringBuffer("DAOImpl genSearchSQL:").append(log).toString());
        // Execute the query.
        Session session = getSessionFactory().openSession();
        List result = null;
        try {
            Query hquery = session.createQuery(sql.toString());
            for (int i = 0; i < valus.size(); i++) {
                if (valus.get(i) == null) {
                    throw new Exception("Common_QureyValueIsNull");
                }
                hquery.setParameter(i, valus.get(i));
            }
            if (!query.countMode && query.pageMode) {
                hquery.setFirstResult(query.firstResult);
                hquery.setMaxResults(query.maxResult);
            }
            result = hquery.list();
            session.close();
        } catch (Exception ex) {
            try {
                session.close();
            } catch (Exception e) {
                e.printStackTrace();
            }
            throw ex;
        }
        return result; // do2bean(result);
    }
    // (The original post omits this method's signature. Judging from the body, which
    //  reads `query` and `sumColumns`, it is a count variant of searchByQuery,
    //  presumably something like:
    //  public List searchCountByQuery(QueryBuilder query, String sumColumns) throws Exception {)
        StringBuffer sql = new StringBuffer();
        // Build the SELECT clause: a count, optionally with extra aggregate columns.
        sql.append("SELECT COUNT(*)");
        if (sumColumns != null) sql.append(",").append(sumColumns);
        // Build the FROM clause; supports multiple entities.
        sql.append(" FROM ").append(getEntityClass().getName()).append(" ").append(query.alias);
        for (int i = 0; i < query.fromTable.size(); i++) {
            sql.append(",").append(query.fromTable.get(i)).append(" ").append(query.fromAlias.get(i));
        }
        StringBuffer log = new StringBuffer().append(sql);
        // Build the WHERE clause.
        StringBuffer sqltemp = new StringBuffer();
        StringBuffer logtemp = new StringBuffer();
        ArrayList<Object> valus = new ArrayList<Object>();
        // Append the first n-1 expressions, each followed by AND.
        for (int i = 0; i < query.criterion.size() - 1; i++) {
            Criterion c = (Criterion) query.criterion.get(i);
            if (BeanExpression.class.isInstance(c)) {
                BeanExpression e = (BeanExpression) c;
                // Call back into the subclass implementation.
                genBeanExpressionSQL(e, query.alias);
            }
            sqltemp.append(c.toSqlString()).append(" AND ");
            logtemp.append(c.toLogString()).append(" AND ");
            for (int j = 0; j < c.getValues().length; j++) {
                valus.add(c.getValues()[j]);
            }
        }
        // Append the last expression (no trailing AND).
        if (query.criterion.size() > 0) {
            Criterion c = (Criterion) query.criterion.get(query.criterion.size() - 1);
            if (BeanExpression.class.isInstance(c)) {
                BeanExpression e = (BeanExpression) c;
                genBeanExpressionSQL(e, query.alias);
            }
            sqltemp.append(c.toSqlString());
            logtemp.append(c.toLogString());
            for (int j = 0; j < c.getValues().length; j++) {
                valus.add(c.getValues()[j]);
            }
        }
        if (sqltemp.length() > 0) {
            sql.append(" WHERE ").append(sqltemp);
            log.append(" WHERE ").append(logtemp);
        }
        // Build GROUP BY; supports multiple columns.
        if (query.groupColumn.size() > 0) {
            sql.append(" GROUP BY ");
            log.append(" GROUP BY ");
        }
        for (int i = 0; i < query.groupColumn.size() - 1; i++) {
            sql.append(query.groupColumn.get(i)).append(",");
            log.append(query.groupColumn.get(i)).append(",");
        }
        if (query.groupColumn.size() > 0) {
            sql.append(query.groupColumn.get(query.groupColumn.size() - 1));
            log.append(query.groupColumn.get(query.groupColumn.size() - 1));
        }
        // Build ORDER BY; supports multiple columns.
        if (query.orderColumn.size() > 0) {
            sql.append(" ORDER BY ");
            log.append(" ORDER BY ");
        }
        for (int i = 0; i < query.orderColumn.size() - 1; i++) {
            sql.append(query.orderColumn.get(i)).append(" ").append(query.orderMode.get(i)).append(",");
            log.append(query.orderColumn.get(i)).append(" ").append(query.orderMode.get(i)).append(",");
        }
        if (query.orderColumn.size() > 0) {
            sql.append(query.orderColumn.get(query.orderColumn.size() - 1))
               .append(" ").append(query.orderMode.get(query.orderColumn.size() - 1));
            log.append(query.orderColumn.get(query.orderColumn.size() - 1))
               .append(" ").append(query.orderMode.get(query.orderColumn.size() - 1));
        }
        logger.debug(new StringBuffer("DAOImpl searchSQL:").append(log).toString());
        // Execute the query.
        Session session = getSessionFactory().openSession();
        List result = null;
        try {
            Query hquery = session.createQuery(sql.toString());
            for (int i = 0; i < valus.size(); i++) {
                if (valus.get(i) == null) {
                    throw new Exception("Common_QureyValueIsNull");
                }
                hquery.setParameter(i, valus.get(i));
            }
            if (!query.countMode && query.pageMode) {
                hquery.setFirstResult(query.firstResult);
                hquery.setMaxResults(query.maxResult);
            }
            result = hquery.list();
            session.close();
        } catch (Exception ex) {
            try {
                session.close();
            } catch (Exception e) {
                e.printStackTrace();
            }
            throw ex;
        }
        return result;
    }

    public void genBeanExpressionSQL(BeanExpression e, String alias) throws Exception {
        return;
    }

    protected StringBuffer genWhereSQL(Entity aDO, boolean exact) throws Exception {
        return null;
    }
    public Date getDBTime() throws Exception {
        Session session = getSessionFactory().openSession();
        Date now = null;
        PreparedStatement stat = null;
        ResultSet rs = null;
        try {
            Connection conn = session.connection();
            // Oracle-specific: SYSDATE/DUAL; other databases need a different query.
            String sql = "SELECT SYSDATE FROM DUAL";
            logger.debug("DAOImpl genSearchSQL:" + sql);
            stat = conn.prepareStatement(sql);
            rs = stat.executeQuery();
            rs.next();
            now = rs.getTimestamp(1);
            rs.close();
            stat.close();
            session.close();
        } catch (Exception ex) {
            try {
                if (rs != null) { rs.close(); }
            } catch (Exception e) {
                logger.error(e.getMessage());
            }
            try {
                if (stat != null) { stat.close(); }
            } catch (Exception e) {
                logger.error(e.getMessage());
            }
            try {
                session.close();
            } catch (Exception e) {
                logger.error(e.getMessage());
            }
            throw ex;
        }
        return now;
    }
How is searchByQuery used in this code?
In "from Member", Member actually refers to the Member class (Member.class), not the Member table in the database directly.
When Hibernate initializes, it maps the Member table onto the Member class; the HQL we write queries against classes, not tables. Clear now?
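To illustrate, here is a self-contained simulation of the HQL string that searchByQuery assembles for a typical call. The entity and column names are hypothetical, and this replicates only the string-building, not the Hibernate session/query execution:

```java
// Simulates the clause assembly performed by searchByQuery for a call like
// new QueryBuilder("m").addExpression(...).addOrderASC("m.id") against an
// entity class Member (hypothetical names).
public class SearchByQueryDemo {
    static String buildHql(String entityClassName, String alias,
                           String whereExpr, String orderColumn, String orderMode) {
        StringBuffer sql = new StringBuffer();
        sql.append("SELECT ").append(alias);                     // select the main alias
        sql.append(" FROM ").append(entityClassName).append(" ").append(alias);
        if (whereExpr != null) sql.append(" WHERE ").append(whereExpr);
        if (orderColumn != null) {
            sql.append(" ORDER BY ").append(orderColumn).append(" ").append(orderMode);
        }
        return sql.toString();
    }

    public static void main(String[] args) {
        // Note the HQL targets the mapped class name, not a table name.
        String hql = buildHql("com.example.Member", "m", "m.name = ?", "m.id", "ASC");
        System.out.println(hql);
        // SELECT m FROM com.example.Member m WHERE m.name = ? ORDER BY m.id ASC
    }
}
```

The resulting string is what the DAO hands to session.createQuery before binding the positional parameters and paging settings.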